
Tips and Tricks to Migrate ML Workloads from Spark ML to Snowpark ML

On-Demand

Many organizations using PySpark for machine learning (ML) have to move data out of storage into an external processing environment, which can be difficult to scale and costly to maintain. The manual cluster management and tuning these legacy environments require only adds to the complexity. Snowflake ML streamlines end-to-end ML workflows on a single, governed platform: it is an integrated set of capabilities for model development and operations, accessed through the Python APIs of the Snowpark ML library.

Watch this session with Snowflake experts on migrating ML workloads from PySpark ML to Snowpark ML to learn about:

  • How to use the Snowpark ML Modeling APIs and the Snowflake Model Registry to build and operationalize ML models (see the sketch after this list)
  • Benefits demonstrated by real customer use cases built entirely in Snowflake ML
  • Considerations and best practices for PySpark to Snowflake ML migrations
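
For orientation, the sketch below illustrates the pattern the first bullet describes: training a model with the Snowpark ML Modeling API and logging it to the Snowflake Model Registry. It assumes the snowflake-ml-python package and a Snowflake account; the connection parameters, the table MY_TRAINING_TABLE, and the feature/label column names are placeholders, not details from the session.

```python
# Minimal sketch, assuming snowflake-ml-python is installed and the
# connection parameters, table, and column names below are replaced
# with real values for your account.
from snowflake.snowpark import Session
from snowflake.ml.modeling.xgboost import XGBClassifier
from snowflake.ml.registry import Registry

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# Training data stays in Snowflake as a Snowpark DataFrame.
train_df = session.table("MY_TRAINING_TABLE")  # placeholder table

# Scikit-learn-style estimator that runs inside Snowflake.
clf = XGBClassifier(
    input_cols=["FEATURE_1", "FEATURE_2"],  # placeholder feature columns
    label_cols=["LABEL"],                   # placeholder label column
    output_cols=["PREDICTION"],
)
clf.fit(train_df)

# Log the fitted model to the Model Registry for versioning and deployment.
registry = Registry(session=session)
model_version = registry.log_model(
    clf, model_name="my_model", version_name="v1"
)

# Run inference on a Snowpark DataFrame using the registered model version.
predictions = model_version.run(train_df, function_name="predict")
predictions.show()
```

This mirrors the familiar fit/predict workflow from Spark ML and scikit-learn, but the computation is pushed down into Snowflake, so no data needs to be exported to an external cluster.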
Speakers
Okhtay Azarmanesh

Principal Architect, Machine Learning Field CTO
Snowflake

Lucy Zhu

Product Marketing Manager
Snowflake

Watch Now 
