
How to Accelerate Your Apache Spark Workloads on Snowflake

Hands-on lab featuring real-world insights from Toyota

On-Demand


As data engineering teams build for AI, performance and total cost of ownership become critical. Apache Spark remains essential for transforming data, but operating and scaling Spark infrastructure often becomes the bottleneck.

In this session, we’ll explore how you can run your existing Spark workloads directly on the Snowflake vectorized engine, reducing infrastructure overhead and improving performance. You’ll also hear from Toyota on how they are running Spark workloads more efficiently.

You’ll learn:

  • How to run existing PySpark workloads (including Spark DataFrames and UDFs) on Snowflake with minimal migration

  • How Spark workloads execute on Snowflake’s elastic compute via Snowpark Connect, and why this execution improves performance and total cost of ownership

  • How Toyota consolidated their Spark workloads, reducing operational complexity and costs
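To make the "minimal migration" point concrete, here is a sketch of what running existing PySpark code through a Spark Connect endpoint looks like. The connection URL is a placeholder and the exact Snowpark Connect session bootstrap may differ from this; refer to Snowflake's documentation for the supported setup.

```python
# Sketch only: existing PySpark DataFrame and UDF code, unchanged,
# pointed at a Spark Connect endpoint instead of a self-managed cluster.
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

# Placeholder endpoint -- with Snowpark Connect, the session would be
# configured to target Snowflake's compute rather than a local server.
spark = SparkSession.builder.remote("sc://localhost").getOrCreate()

# Existing DataFrame code runs as-is.
df = spark.createDataFrame([(1, "forklift"), (2, "sedan")], ["id", "model"])

# Existing Python UDFs carry over as well.
to_upper = udf(lambda s: s.upper(), StringType())
df.withColumn("model_upper", to_upper(df["model"])).show()
```

The point of the pattern is that only the session setup changes; the DataFrame transformations and UDFs below it are ordinary PySpark.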


Speakers

Ash Ubrani, Sr. Product Marketing Manager, Snowflake
Vino Duraisamy, Sr. Developer Advocate, Snowflake
Devin Petersohn, Staff Software Engineer, Snowflake
Nour Qweder, Senior Data Scientist, R&D, Toyota Material Handling