Many organizations using PySpark for machine learning (ML) must move data out of storage into an external processing environment, an approach that can be difficult to scale and costly to maintain. These legacy environments also demand manual infrastructure management and cluster tuning. Snowflake ML streamlines end-to-end ML workflows on a single, governed platform: it is an integrated set of capabilities for model development and operations, accessible through the Python APIs in the Snowpark ML library.
Watch this session with Snowflake experts on migrating ML workloads from PySpark ML to Snowpark ML and learn about:
- How to use the Snowpark ML Modeling API and the Snowflake Model Registry to easily build and operationalize ML models (see the sketch after this list)
- Benefits demonstrated by real customer use cases built entirely in Snowflake ML
- Considerations and best practices for PySpark to Snowflake ML migrations
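
To illustrate the kind of workflow the session covers, here is a minimal sketch (not taken from the webinar) of training a model with the Snowpark ML Modeling API and logging it to the Snowflake Model Registry. The connection parameters, table name, and column names below are hypothetical placeholders.

```python
from snowflake.snowpark import Session
from snowflake.ml.modeling.xgboost import XGBClassifier
from snowflake.ml.registry import Registry

# Hypothetical connection settings -- replace with your own account details.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# Snowpark DataFrame over a governed table -- the data stays in Snowflake.
train_df = session.table("CUSTOMER_FEATURES")  # hypothetical table

# scikit-learn-style estimator from the Snowpark ML Modeling API;
# training runs inside Snowflake rather than in an external Spark cluster.
clf = XGBClassifier(
    input_cols=["TENURE_MONTHS", "MONTHLY_SPEND", "SUPPORT_TICKETS"],
    label_cols=["CHURNED"],
    output_cols=["PREDICTED_CHURN"],
)
clf.fit(train_df)

# Log the fitted model to the Snowflake Model Registry so it can be
# versioned, governed, and used for inference from within Snowflake.
registry = Registry(session=session)
registry.log_model(clf, model_name="CHURN_MODEL", version_name="V1")
```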
Speakers
Okhtay Azarmanesh
Principal Architect, Machine Learning Field CTO
Snowflake
Lucy Zhu
Product Marketing Manager
Snowflake