
Use Case

End-to-end machine learning

Accelerate machine learning from prototype to production with distributed GPUs or CPUs on the same platform as your governed data. Streamline model development and MLOps with no infrastructure to maintain or configure — all through a centralized UI.


Overview

Piecing together many tools for ML workflows can be complex. Get models ready for production on one platform.

Develop, deploy and monitor ML features and models at scale with a fully integrated platform that brings tools, workflows and compute infrastructure to the data.


Integrate development and MLOps

Unify model pipelines end to end with any open source model on the same platform where your data lives.


Scale models out of the box

Scale ML pipelines over CPUs or GPUs with built-in infrastructure optimizations — no manual tuning or configuration required.


Generate trusted ML insights

Discover, manage and govern features and models in Snowflake across the entire lifecycle.

ML Workflow

Accelerate the lifecycle from development to production with Snowflake ML

Model Development

Build scalable models on Snowflake data

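For illustration, here is a minimal sketch of building a model on Snowflake data with the snowflake-ml-python modeling API. It assumes an existing Snowpark session named session; the database, table and column names are hypothetical.

    from snowflake.ml.modeling.xgboost import XGBClassifier

    # Hypothetical training table already governed in Snowflake
    train_df = session.table("ML_DB.PUBLIC.CUSTOMER_TRAINING")

    clf = XGBClassifier(
        input_cols=["TENURE_MONTHS", "MONTHLY_SPEND", "SUPPORT_TICKETS"],
        label_cols=["CHURNED"],
        output_cols=["CHURN_PREDICTION"],
    )

    clf.fit(train_df)                     # training runs on Snowflake compute
    predictions = clf.predict(train_df)   # returns a Snowpark DataFrame

The same estimator-style interface wraps familiar scikit-learn, XGBoost and LightGBM algorithms, so existing Python workflows carry over with minimal changes.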

Feature Management

Develop and manage features for production-grade pipelines

Create, manage and serve ML features with continuous, automated refresh on batch or streaming data using the Snowflake Feature Store. Promote discoverability, reuse and governance of features across training and inference.
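
As a rough sketch of that workflow, assuming the snowflake.ml.feature_store API, an existing Snowpark session named session, and hypothetical database, warehouse and table names:

    from snowflake.ml.feature_store import (
        FeatureStore, Entity, FeatureView, CreationMode,
    )
    from snowflake.snowpark.functions import sum as sum_

    fs = FeatureStore(
        session=session,
        database="ML_DB",                 # hypothetical names
        name="FEATURE_STORE",
        default_warehouse="ML_WH",
        creation_mode=CreationMode.CREATE_IF_NOT_EXIST,
    )

    # Register the entity that features are keyed on
    customer = Entity(name="CUSTOMER", join_keys=["CUSTOMER_ID"])
    fs.register_entity(customer)

    # Define a feature pipeline as a Snowpark DataFrame
    spend_df = (
        session.table("RAW_ORDERS")
        .group_by("CUSTOMER_ID")
        .agg(sum_("ORDER_AMOUNT").alias("TOTAL_SPEND"))
    )

    # Register a feature view with continuous, automated refresh
    spend_fv = FeatureView(
        name="CUSTOMER_SPEND",
        entities=[customer],
        feature_df=spend_df,
        refresh_freq="1 hour",
    )
    fs.register_feature_view(spend_fv, version="V1")

Registered feature views can then be retrieved by name for both training and inference, which is what keeps the two consistent.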

Production

Deploy ML models built anywhere for low-latency inference

  • Log models built anywhere in the Snowflake Model Registry and serve them for real-time or batch predictions on Snowflake data with distributed GPUs or CPUs (see the sketch after this list).

  • Easily monitor performance and drift metrics with integrated ML Observability.
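
A minimal sketch of that flow with the snowflake.ml.registry API, assuming a fitted model object named model, a small sample_df used to infer the model signature, and an existing Snowpark session; the database, schema and table names are illustrative.

    from snowflake.ml.registry import Registry

    reg = Registry(session=session, database_name="ML_DB", schema_name="MODELS")

    # Log a model trained anywhere (scikit-learn, XGBoost, PyTorch, ...)
    mv = reg.log_model(
        model,
        model_name="CHURN_MODEL",
        version_name="V1",
        sample_input_data=sample_df,
    )

    # Batch inference directly against Snowflake data
    scored = mv.run(session.table("CUSTOMER_FEATURES"), function_name="predict")

Drift and performance metrics for the logged model version can then be tracked with integrated ML Observability.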

Manufacturing

“Previously, the process to train all these models and generate predictions took a half hour. The unified model on Snowflake is super quick; we’re talking minutes to generate forecasts for hundreds of thousands of customers. This speed and simplicity will help unlock additional capabilities for the business like simulation and scenario forecasting.”

Dan Shah
Manager of Data Science



Feature overview

Learn more about the integrated features for development and production in Snowflake ML

Get Started

Take the next step with Snowflake

Start your 30-day free Snowflake trial today

  • $400 in free usage to start
  • Immediate access to the latest Snowflake ML features
  • Build and deploy a model with CPUs or GPUs

End-to-end ML

FAQs

Can I build and deploy models with distributed processing on CPUs or GPUs?

Yes, data scientists and ML engineers can build and deploy models with distributed processing on CPUs or GPUs. This is enabled by the underlying container-based infrastructure that powers the Snowflake ML platform.

Where can I build features and models?

You can build features and models directly from Snowflake Notebooks, or through any IDE of choice with ML Jobs.

Do models have to be built in Snowflake to run in production?

No, you can bring models built anywhere and run them in production on Snowflake data. During inference, you can take advantage of integrated MLOps features such as ML Observability and RBAC governance.

Does Snowflake ML support open-source libraries and models?

Yes, Snowflake ML is fully compatible with any open-source library. Securely access open-source repositories via pip and bring in any model from hubs such as Hugging Face.
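
For example, a hypothetical sketch that pulls an open-source model from the Hugging Face Hub inside a Snowflake Notebook, assuming the transformers package has been installed via pip:

    from transformers import pipeline

    # Load an open-source model directly from the Hugging Face Hub
    sentiment = pipeline(
        "sentiment-analysis",
        model="distilbert-base-uncased-finetuned-sst-2-english",
    )
    print(sentiment(["Snowflake ML made this pipeline easy to ship."]))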

How is Snowflake ML priced?

Snowflake operates on a consumption-based pricing model; see the latest credit pricing table for details.

Can I try Snowflake ML for free?

Yes, you can try any of our ML quickstarts directly from the free trial experience.
