Virtual Hands-On Lab

Build Data Engineering Pipelines Using Snowpark in Snowflake Notebooks

April 15


Get started building data engineering pipelines using Snowpark in Snowflake Notebooks. With the latest Notebooks experience now available in Snowflake Workspaces, you’ll be working in a more complete, IDE-style environment with built-in organization, Git integration, terminal access, and AI-assisted development through Cortex Code. This makes it easier to develop, manage, and productionize your pipelines alongside SQL scripts, dbt projects, and Python code in one place.
In this session, you’ll learn how to build a pipeline with Python in Snowflake and manage and deploy it through an automated CI/CD process. We’ll walk through how to:

  • Ingest custom file formats (like Excel) with Snowpark
  • Access data from Snowflake Marketplace and use it for your analysis
  • Build data engineering pipelines with Snowflake Notebooks and the Snowpark DataFrame API
  • Add logging to your Python code and monitor from within Snowsight
  • Programmatically interact with Snowflake objects via the Snowflake Python Management API
  • Use the Python Task DAG API to manage Snowflake Tasks
  • Leverage Cortex Code to accelerate data engineering development
  • Build CI/CD pipelines using the Snowflake CLI and GitHub Actions
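To give a flavor of that last step, a minimal GitHub Actions workflow that deploys through the Snowflake CLI might look like the sketch below. This is illustrative only: the workflow name, the script path (`deploy_pipeline.sql`), and the secret names are assumptions, and the pipeline built in the lab will differ.

```yaml
# Illustrative sketch: file paths and secret names are assumptions.
name: deploy-pipeline
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      # Credentials pulled from GitHub repository secrets
      SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
      SNOWFLAKE_USER: ${{ secrets.SNOWFLAKE_USER }}
      SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install Snowflake CLI
        run: pip install snowflake-cli
      - name: Deploy pipeline objects
        # --temporary-connection (-x) builds a connection from the env vars above
        run: snow sql -f deploy_pipeline.sql --temporary-connection
```

In a real setup you would typically use key-pair authentication rather than a password secret, and gate deployment on test jobs; the session walks through the full pattern.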

Speakers

Jeremiah Hansen, Principal Data Platform Architect, Snowflake