
WORKLOAD SERIES – PART II

SNOWFLAKE DISCOVER

DATA ENGINEERING + SCIENCE/ML (SNOWPARK)

One Platform. Multiple Workloads.
2 Workloads – 12 Sessions – Live – Virtual
May 10 & 11

Register – AEST | SGT | HKT – Time Friendly     Register – IST | SGT | HKT – Time Friendly           

WHAT TO EXPECT

Snowflake Discover – Workload Series (Part 2) will deliver a deep dive into Snowflake Data Engineering and Data Science/ML with Snowpark. Discover is a live, virtual enablement event designed to share and explore what is possible with the Data Cloud. This edition will develop your expertise in Data Engineering and Science/ML with Snowpark via 30-minute what/why and 60-minute step-by-step technical sessions. Built for technical professionals at different levels of expertise, the event delivers an end-to-end immersive experience across 6 subject areas, 12 unique sessions and live Q&A.
 

WHAT YOU WILL EXPLORE & LEARN

  • The A-Z of Data Engineering + Science/ML with Snowflake
  • Simplifying Data Architecture – Concurrency, Siloed/Diverse Data
  • Snowpark to Simplify, Automate Pipelines with Governance/Security
  • Using Snowpark Python Stored Procedures – Processing to Deployment
  • Streaming Data with Snowflake – Key Concepts and Application Scenarios
  • Snowpipe Streaming Utilising Serverless APIs
  • Snowflake for ML/AI Use Cases – Access to Data Science Models
  • Python Natively on Snowpark – Implementing end-to-end ML Workflows
  • Feature Store on Snowflake – Build and Deployment Architectures
  • Building a Feature Store using Feast – Set-up & Integration

 

WHO SHOULD ATTEND

Snowflake Discover has been designed for individuals with limited to extensive experience using Snowflake who are looking to get started or expand their skills and confidence with the Data Cloud – including IT Managers, Data Architects, Engineers, Scientists, Administrators and Developers.

REGISTER

AEST | NZST | SGT TIME FRIENDLY

To register for IST | SGT Time Friendly click here

JOIN US – SNOWFLAKE DISCOVER

 

CONTENT LEVEL GUIDE

Level 100 - Introductory

Sessions provide a step-by-step overview of the topic, designed for professionals who are new to the subject area

Level 200 - Intermediate

Sessions provide technical deployment best practice, including demos, designed for professionals who have an introductory understanding of the subject area

Level 300 - Advanced

Technical step-by-step deep-dive designed for professionals familiar with the subject area, with or without direct implementation experience

AGENDA – AEST | NZST | SGT | HKT – TIME FRIENDLY

For India (IST) | ASEAN (SGT) – Time Friendly – please click here.

                            WEDNESDAY 10 MAY
Introduction to Data Engineering with Snowflake

10:00 AM AEST | 12:00 PM NZST

L100 - A-Z Data Engineering Workload on Snowflake
Session Descriptor

Data and analytics practitioners struggle with siloed and diverse data and with performance that degrades over time, problems exacerbated by operating and maintaining complex pipelines and architecture. In this session we will walk you through the Data Engineering workload on Snowflake: improve the reliability of your pipelines, speed up data transformation using streams and tasks, gain granular control over data accessibility and security, and much more.
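The streams-and-tasks pattern this session refers to can be sketched as the SQL a simple incremental pipeline might issue. This is an illustrative sketch only: the table, stream, task and warehouse names below are hypothetical, not material from the event.

```python
# Illustrative sketch of the stream + task pattern for incremental
# transformation on Snowflake. All object names below are hypothetical.

def stream_and_task_ddl(source: str, target: str, warehouse: str) -> list[str]:
    """Return the DDL statements a simple incremental pipeline might issue."""
    stream = f"{source}_STREAM"
    task = f"LOAD_{target}"
    return [
        # A stream records row-level changes (inserts/updates/deletes) on the source.
        f"CREATE OR REPLACE STREAM {stream} ON TABLE {source};",
        # A task runs on a schedule, but only when the stream actually has new data.
        f"CREATE OR REPLACE TASK {task}\n"
        f"  WAREHOUSE = {warehouse}\n"
        f"  SCHEDULE = '5 MINUTE'\n"
        f"  WHEN SYSTEM$STREAM_HAS_DATA('{stream}')\n"
        f"AS\n"
        f"  INSERT INTO {target} SELECT * FROM {stream};",
        # Tasks are created suspended; resume to start the schedule.
        f"ALTER TASK {task} RESUME;",
    ]

ddl = stream_and_task_ddl("RAW_ORDERS", "CLEAN_ORDERS", "ETL_WH")
```

Gating the task on `SYSTEM$STREAM_HAS_DATA` avoids burning warehouse credits on runs where nothing changed.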

10:30 AM AEST | 12:30 PM NZST

L100 - Zero to Hero with the Data Engineering Workload on Snowflake
Session Descriptor

Now that you have learned what the data engineering workload on Snowflake is all about, it’s time to learn how. In this session we will give you practical tips on improving concurrency handling, working with siloed and diverse data, and simplifying pipeline architecture.

Using Snowpark & Python for Advanced Data Engineering

11:30 AM AEST | 1:30 PM NZST

L100 - Introducing concepts of Snowpark for Data Engineering
Session Descriptor

Data engineering is both broad and tricky. Spanning data ingestion, transformation and consumption, data practitioners often need to incorporate DevOps best practices to get the most out of their data engineering pipelines. In this session, we’ll explain how Snowpark lets you simplify, speed up and even automate aspects of your pipelines without having to worry about governance and security.

12:00 PM AEST | 2:00 PM NZST

L200 - Implementing Data Engineering using Snowpark
Session Descriptor

By the end of this session you will have built a robust and reliable data engineering pipeline using Snowpark Python stored procedures. The pipeline will process data incrementally, be orchestrated with Snowflake tasks, and be deployed via a CI/CD pipeline.
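A session like this might build something along the following lines: a minimal, hypothetical sketch of a Snowpark Python stored-procedure handler, with the transformation kept as a pure function so it can be unit tested without a Snowflake session. The table names and the `session` usage are assumptions for illustration, not material from the event.

```python
# Hypothetical sketch of an incremental Snowpark Python pipeline step.
# The transformation is session-free so it can be unit tested locally.

def transform(rows: list[dict]) -> list[dict]:
    """Drop non-positive amounts and apply a 10% uplift (illustrative logic)."""
    return [
        {"id": r["id"], "amount": round(r["amount"] * 1.1, 2)}
        for r in rows
        if r["amount"] > 0
    ]

def run(session) -> str:
    """Handler to register as a Snowpark stored procedure (assumes a Session).

    Reading from a stream (RAW_ORDERS_STREAM here, hypothetical) yields only
    changed rows, giving incremental processing; a Snowflake task would then
    invoke this procedure on a schedule, and a CI/CD pipeline would redeploy it.
    """
    new_rows = [row.as_dict() for row in session.table("RAW_ORDERS_STREAM").collect()]
    cleaned = transform(new_rows)
    session.create_dataframe(cleaned).write.save_as_table("CLEAN_ORDERS", mode="append")
    return f"processed {len(cleaned)} rows"
```

Keeping business logic out of the handler is what makes the "deployed via CI/CD" part practical: the pure function is what your test suite exercises.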

Streaming data processing on Snowflake

1:00 PM AEST | 3:00 PM NZST

L300 - Go with the Flow - Streaming Data Processing introduction on Snowflake
Session Descriptor

Ingesting and processing real-time or near-real-time data is a requirement for many data teams looking to enable quick decision making. In this session we introduce how to handle streaming data with Snowflake, looking at key concepts and common application scenarios.

1:30 PM AEST | 3:30 PM NZST

L300 - Row the Boat - Setup and implementation of Streaming Data Processing on Snowflake
Session Descriptor

It’s time to set up a pipeline and get a river of data flowing! In this session we will show you how Snowpipe Streaming loads continuously generated rowset data into Snowflake utilising a serverless API.

                            THURSDAY 11 MAY
Data Science with Snowflake

10:00 AM AEST | 12:00 PM NZST

L100 - A-Z Data Science on Snowflake

Session Descriptor

If you are a data strategist, data scientist or ML engineer wondering how you could leverage your Snowflake platform for ML/AI use cases, look no further. In this session, you will learn how the Snowflake Data Cloud supports the Data Science workload from data collection, exploration and processing through to model training, deployment and monitoring. You will also be introduced to some of the other ML capabilities within Snowflake targeted at non-expert data science personas, such as analysts and business users, and learn how Snowflake makes access to data science models easy for them.

10:30 AM AEST | 12:30 PM NZST

L100 - Practical Data Science - Right on Snowflake

Session Descriptor

In this session, you will learn how to do end-to-end, actionable data science: going all the way from accessing raw data within Snowflake and creating a training dataset, to building a model and deploying it for inference within Snowflake using Snowpark for Python. You will also see a simple interactive live application running on top of this model deployed on Snowflake.

Data Science using Snowpark & Python

11:30 AM AEST | 1:30 PM NZST

L200 - Walk in the Park: Doing Data Science on Snowflake using Snowpark
Session Descriptor

In this session, you will get a deep dive into the Snowpark project, specifically running Python natively on Snowflake. You will learn how Python can be used to implement an end-to-end Machine Learning workflow.
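As a rough, dependency-free illustration of what "end-to-end" means here: train a model on features, then wrap it in a callable that stands in for the inference endpoint. In Snowpark the equivalent steps would run inside Snowflake (training in a stored procedure, inference via a UDF); the data and function names below are made up for illustration.

```python
# Dependency-free stand-in for a train -> deploy-for-inference workflow.
# In Snowpark, training would run in a stored procedure and inference via a
# UDF over Snowflake data; here we use plain Python and toy data.

def fit_line(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Ordinary least squares for y = a*x + b (the 'training' step)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return a, my - a * mx

def deploy(a: float, b: float):
    """Stands in for registering the trained model for inference (e.g. a UDF)."""
    return lambda x: a * x + b

a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])  # learns y = 2x + 1
predict = deploy(a, b)                        # "deployment" step
```

The point of the workflow is the separation: training produces parameters once, and the deployed predictor is then applied to arbitrary volumes of new rows.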

12:00 PM AEST | 2:00 PM NZST

L200 - Run in the Park: Practically use Python and Snowpark for Data Science in Snowflake
Session Descriptor

You will learn how an end-to-end Machine Learning job can be created using Snowpark for Python, all within Snowflake, and how to deploy the trained model to support inference over data volumes ranging from thousands of rows to millions and more.

Building Feature Store using Feast on Snowflake

1:00 PM AEST | 3:00 PM NZST

L300 - Importance of Feature Stores for Data Science
Session Descriptor

In this session, you will be introduced to the concept and importance of features and feature engineering. You will learn about the options available for building a feature store on Snowflake, along with a few deployment architectures for doing so.

1:30 PM AEST | 3:30 PM NZST

L300 - Setup, Implement and Use a Feast Feature Store for Data Science
Session Descriptor

In this session, we will focus on building a feature store using the open-source project Feast. You will learn how to set up and integrate Feast with Snowflake, then see an end-to-end ML workflow with Feast integrated into the model training and inference activities.
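For context, Feast is configured through a `feature_store.yaml` file; the fragment below sketches what pointing it at Snowflake as the offline store might look like. Treat the exact keys and values as assumptions to verify against the Feast documentation; all credentials shown are placeholders.

```yaml
# Hypothetical feature_store.yaml pointing Feast at Snowflake as the
# offline store. Keys and values are illustrative; check the Feast docs.
project: demo_features
registry: data/registry.db
provider: local
offline_store:
  type: snowflake.offline
  account: <SNOWFLAKE_ACCOUNT>
  user: <SNOWFLAKE_USER>
  password: <SNOWFLAKE_PASSWORD>
  role: <SNOWFLAKE_ROLE>
  warehouse: <SNOWFLAKE_WAREHOUSE>
  database: <SNOWFLAKE_DATABASE>
```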

PRESENTERS + Q&A EXPERTS

Regan is a Senior Sales Engineering Manager at Snowflake, The Data Cloud. Regan built his first Data Warehouse over twenty years ago and has worked with many customers across the globe to architect and implement streaming, big data, and modern data warehouse solutions, both on-premises and in the cloud. Prior to Snowflake, Regan spent several years with Microsoft, the latter of which were spent working on modern data architectures with some of Microsoft’s largest customers across the globe.

LinkedIn: https://www.linkedin.com/in/rmurphy/

Regan Murphy

Principal Sales Engineer

Snowflake


Adrian is a Principal Sales Engineer based in Singapore working with enterprise customers in the financial services sector as well as high tech ‘digital natives’ across ASEAN. Adrian diligently works with customers to ensure that they see success in Snowflake’s Data Cloud from the initial architecture and design all the way to their go-live.

Adrian likes to combine DevOps practices with data and is recognised for the technical materials he produces integrating DevOps tooling and practices with Snowflake. Adrian is also highly passionate about learning and sharing all things data and cloud and has expanded his knowledge to a variety of solution stacks including Redshift, BigQuery, Redis and Kafka.

Adrian Lee

Manager, Solution Engineering

Snowflake


An experienced data professional with strong technical skills spanning analysis, consultancy, architecture and data science, Sarita has a demonstrated history of driving engagement and projects in the Financial Services and eCommerce industries. A detail-oriented specialist with an outgoing personality, she is motivated by the potential of technology to drive changes to people’s lives.

Her day-to-day work involves stakeholder management, problem definition and framing, solution design, data transformation and preparation, and occasionally some machine learning. She is interested in the domains of Data Science, Data Engineering, Architecture and Solutions.

Sarita Priyadarshini

Senior Sales Engineer

Snowflake


An experienced Data Science professional with strong technical skills, Changboon has a demonstrated history of driving engagement and projects in the Financial Services and eCommerce industries. A detail-oriented specialist with an outgoing personality, he is motivated by the potential of technology to drive changes to people’s lives.

His day-to-day work involves stakeholder management, problem definition and framing, solution design, data transformation and preparation, and occasionally some machine learning. He is interested in the domains of Data Science, Data Engineering, Architecture and Solutions.

Changboon Heng

Senior Sales Engineer

Snowflake


Experienced Data Architect with expertise in designing and building big data platforms and data pipelines for advanced analytics, Data Science and BI use cases, along with MLOps, DataOps and Data Warehousing.

Prashant Yadav

Principal Architect, Data Engineering, Field CTO Office

Snowflake


Marko is a Principal Sales Engineer at Snowflake who brings years of experience working with customers across multiple disciplines including data integration, data platforms and data science. With industry experience spanning the telecommunications, banking, and retail sectors Marko now assists clients in their journey as data driven businesses.

Marko Slabak

Principal Sales Engineer

Snowflake


Tim Buchhorn

Field CTO, APJ

Snowflake


Majid Miri

Senior Sales Engineer

Snowflake


Stephen Ermann

Senior Sales Engineer

Snowflake


Rishu has 18+ years of Information Technology experience in solution design and architecture and the strategic design of future data platforms and roadmaps, predominantly focused on data and information management – particularly cloud-native data platforms and technologies, big data, analytics, Machine Learning, Business Intelligence, Data Management and Artificial Intelligence.

Rishu Saxena

Principal Architect, Machine Learning, Field CTO Office

Snowflake


Wasim El-Omari is a Cyber Security Field CTO at Snowflake, based in Melbourne, Australia, and working in a global team focused primarily on the APJ region. Wasim has more than 20 years of experience in Cyber Security, with a background in Security Operations, Incident Response, Digital Forensics and Cloud Security. He holds a Bachelor’s Degree in Electrical Engineering (Telecommunications) and a Master’s Degree in Information Security, with a focused interest in Data Governance, Risk Management and Incident Handling.

His career has taken him to different regions across the globe, working with Fortune 100 companies and federal agencies to design, implement and validate effective security programs. At Snowflake he focuses on the Data Cloud and the security intrinsically built into it to address digitisation challenges, and he is passionate about helping customers address their digital challenges in the Snowflake Data Cloud.

LinkedIn: https://www.linkedin.com/in/wasim-elomari/

Wasim El-Omari

Principal Architect, Security, Field CTO Office

Snowflake


Ron Dunn is a Principal SE with Snowflake. He has more than 35 years of experience in data warehousing and analytics, across multiple countries, industries and platforms. Experienced in multiple data warehouse methodologies and architectures, Ron’s main interests are data engineering and performance.

Ron Dunn

Principal Architect, Field CTO Office

Snowflake


 

REGISTER NOW  SHARE INVITE BY EMAIL  SHARE INVITE BY LINKEDIN
