You can now run Apache Spark DataFrame code on Snowflake. Join our deep dive into Snowpark Connect for Apache Spark, now in public preview.
Built on Spark Connect, this feature lets Apache Spark clients such as PySpark connect seamlessly to the Snowflake platform. You can execute your Apache Spark DataFrame, Spark SQL, and user-defined function (UDF) code directly against your data (in Snowflake, Iceberg Tables, or external storage) using Snowpark execution. Say goodbye to the complexities of maintaining separate Apache Spark clusters: no more managing dependencies, wrestling with version compatibility, or planning upgrades.
In this webinar, we’ll go deep into the feature, how it works, and how to get started. What you’ll learn:
- 
- The architecture: Understand how Snowpark Connect leverages Spark Connect to bridge your Apache Spark applications with the Snowflake engine.
- Developer experience: Watch live demos of Apache Spark code—including DataFrames, SQL, and UDFs—running directly on Snowflake’s managed compute.
- Compatibility tooling: Learn how the Snowpark Migration Accelerator (SMA) helps you identify the best-fit workloads and connect your existing Apache Spark DataFrame applications to Snowflake.
Speakers

Nimesh Bhagat
Senior Manager, Product Management
Snowflake

Shruti Anand
Senior Product Manager
Snowflake
Register Here