
Snowpark Container Services: Securely Deploy and Run Sophisticated Generative AI and Full-Stack Apps in Snowflake

Containers have emerged as the modern approach to packaging code in any language, ensuring portability and consistency across environments, especially for sophisticated AI/ML models and full-stack, data-intensive apps. These modern data products frequently deal with massive amounts of proprietary data. Creating, developing, and running these workloads at scale is complex, forcing developers and data scientists to spend more time managing the compute and clusters behind their applications than focusing on the business problem. Additionally, because full-stack apps and LLMs can't run directly where data is governed, securing the data used in those scenarios becomes very hard and sometimes close to impossible. 

To make it easier to bring full-stack apps, LLMs and other sophisticated data products to the data securely, we are introducing Snowpark Container Services, now in private preview. This additional Snowpark runtime option enables developers to effortlessly deploy, manage, and scale containerized workloads (jobs, services, service functions) using secure Snowflake-managed infrastructure with configurable hardware options, such as GPUs. 

This new runtime eliminates the need for users to deal with complex operations of managing and maintaining compute and clusters for containers. With containers running in Snowflake, there is no need to move governed data outside of Snowflake (thereby exposing it to additional security risks) in order to use it as part of the most sophisticated AI/ML models and apps, whether developed internally or by third-party providers available as Snowflake Native Apps installable from the Snowflake Marketplace. 

Figure 1: With Snowpark Container Services you can run internally developed data products or install and run sophisticated third-party Snowflake Native Apps all within your Snowflake account.

Snowpark Container Services Overview

Programming language and hardware flexibility

Containers built and packaged by developers using their tools of choice can include code in any programming language (e.g., C/C++, Node.js, Python, R, React) and can be executed using configurable hardware options, including GPUs. This flexibility in programming language and hardware further expands the scope of AI/ML and app workloads that Snowpark can bring to Snowflake data.

For example, data science teams can accelerate the execution of machine learning (ML) Python libraries used in training and inference jobs. They can also run computationally intensive generative AI such as large language models (LLMs). App developers can build and deploy front-end user interfaces using React and other popular web development frameworks. Data engineers can run optimized logic, typically written in C/C++, in the same processing engine that runs SQL or Python DataFrame operations.
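Because workloads are packaged as standard container images, there is nothing Snowflake-specific about how they are built. As an illustration, a GPU-accelerated Python inference job might be packaged with an ordinary Dockerfile like the sketch below (the base image, file names, and entry point are hypothetical, not prescribed by Snowpark Container Services):

```dockerfile
# Hypothetical packaging for a GPU inference job; names are illustrative.
FROM nvidia/cuda:11.8.0-runtime-ubuntu22.04

RUN apt-get update && apt-get install -y python3 python3-pip

WORKDIR /app
COPY requirements.txt .
RUN pip3 install -r requirements.txt   # e.g. torch, transformers

COPY inference.py .
CMD ["python3", "inference.py"]
```

The resulting image is built and pushed with standard tooling (e.g., `docker build` and `docker push`) before being referenced from Snowflake.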

When we first launched Snowpark in 2020, we introduced a secure way to deploy and process Python, Java, and Scala code in Snowflake through a set of libraries and runtimes, including user-defined functions (UDFs) and stored procedures. Snowpark Container Services adds one more way to realize the Snowpark vision: a trusted, powerful, and familiar way for developers to effortlessly process any non-SQL code inside Snowflake's governed data boundary. As with the rest of Snowpark, the code and logic can be accessed and integrated from any Snowflake experience. Users can write data queries in SQL, Python, Java, or Scala that process data through containers running in Snowpark Container Services.

Fully managed and unified services

Snowpark Container Services provides a simple, unified experience for the end-to-end lifecycle of containerized applications and AI/ML models. Other solutions require you to manually stitch together a container registry, a container management service, and a compute service, and then to manage your own separate tools for observability, data connectivity, and security. 

Staying true to Snowflake's philosophy that the platform should "just work," Snowpark Container Services brings all of these together and removes the need to deal with compute and cluster maintenance, speeding up the development and productization of data applications. Developers only need to provide their containers; Snowflake hosts and serves them at scale, without anyone having to learn the complexities of Kubernetes. 
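In the preview, a service is described to Snowflake through a declarative specification file rather than Kubernetes manifests. A minimal sketch might look like the following (field names reflect the preview format and may change; the image path and endpoint are hypothetical):

```yaml
# Hypothetical service specification (preview syntax, subject to change).
spec:
  containers:
  - name: my-app
    image: /my_db/my_schema/my_repo/my_app:latest  # image in a Snowflake image repository
  endpoints:
  - name: api
    port: 8080
    public: true  # expose a secured ingress endpoint
```

The specification lists the containers to run and, for services, the endpoints to expose; everything else (scheduling, networking, scaling the underlying nodes) is handled by Snowflake.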

As part of supporting broader access to containers, developers have the option to use SQL, CLI or Python as the programming interface. To support a wide range of workloads, Snowpark Container Services has three execution options:

  • Jobs: Time-bounded processes triggered manually on an ad hoc basis or as part of a scheduled process. Common examples include container images used to kick off ML training on GPUs, or a step in a data pipeline running any language, framework, or library inside a container. 
  • Service Functions: Time-bounded processes, backed by your containers, that take input, perform an action, and can be invoked repeatedly, including from queries. 
  • Services: Long-running processes with secured ingress endpoints that typically host an application front end or an API and must always be available for on-demand requests. 
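Putting these pieces together, the preview exposes the workflow through SQL along the following lines. This is a hedged sketch: the object names are illustrative, and the private-preview syntax may change before general availability.

```sql
-- Hypothetical sketch based on preview syntax; all names are illustrative.

-- 1. Provision Snowflake-managed compute (GPU instance families available).
CREATE COMPUTE POOL my_gpu_pool
  MIN_NODES = 1
  MAX_NODES = 1
  INSTANCE_FAMILY = GPU_NV_S;

-- 2. Run a long-lived service from a container image and its specification.
CREATE SERVICE my_service
  IN COMPUTE POOL my_gpu_pool
  FROM @my_stage
  SPECIFICATION_FILE = 'my_service_spec.yaml';

-- 3. Expose the service as a function callable from any SQL query.
CREATE FUNCTION score(input VARCHAR)
  RETURNS VARCHAR
  SERVICE = my_service
  ENDPOINT = 'api';

SELECT score(review_text) FROM product_reviews;
```

The same objects can be created through the Python API or CLI; the SQL form simply makes the job/service/service-function distinction concrete.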
Figure 2: Option to use the Python API, SQL, or CLI (not shown) to create a service running in Snowflake and accessible over HTTPS

Bring sophisticated third-party software and apps to the data

Snowpark Container Services can be used as part of a Snowflake Native App to enable developers to distribute sophisticated apps that run entirely in their end-customer’s Snowflake account. For Snowflake consumers, this means they can securely install and run cutting-edge products like hosted notebooks and LLMs inside their Snowflake account in a way that protects the provider’s proprietary IP. 

  • Hex. Data science teams can accelerate their analytics and data science by using Snowpark Container Services to deploy Hex, a platform for collaborative analytics and data science. They can query and process their data with SQL, R, and Python, and run machine learning training on GPUs, all while ensuring the data never leaves the consumer's Snowflake account. Check out the demo video and sign up for the waitlist to deploy Hex in Snowflake. 
  • LLM Providers. Organizations can bring cutting-edge generative AI models inside their Snowflake account as Snowflake Native Apps, where both the provider's intellectual property (IP) and the consumer's data are fully protected. For providers such as AI21 Labs, Reka, or NVIDIA (with the NeMo framework, part of the NVIDIA AI Enterprise software platform), LLM weights and other proprietary IP are not exposed to the app consumer, because the logic inside the Snowflake Native App is not accessible. And because the LLM runs inside the end consumer's account, governed enterprise data used for fine-tuning or other interaction with the LLM is never exposed back to the developer. See this in action in the demo from the Snowflake Summit 2023 keynote. 

This new Snowpark runtime further expands and strengthens the Snowpark Accelerated partner ecosystem. It leverages Snowflake's powerful platform to enhance the experience of technical and business teams through our partners' integrated services and applications, which bring processing to the governed data.

  • Alteryx. Create a workflow in Alteryx Designer choosing from 160+ no-code, code-friendly tools to solve an advanced analytics problem such as building a predictive model to forecast demand. Upload the workflow to the Alteryx Analytics Cloud Platform and execute the logic in Snowpark Container Services.
  • Astronomer. Use Apache Airflow to orchestrate complex and powerful data pipelines—with advanced scheduling, alerting, monitoring, complex branching, retry logic, and more—by bringing code to data, without dependency management or specialized compute restrictions.
  • Dataiku. Deploy models trained in Dataiku to Snowpark Container Services as a scalable inference service. With a web application, Dataiku can feed features into the model, and receive predictions back from Snowpark Container Services immediately.
  • NVIDIA. NVIDIA GPUs will serve as the accelerated infrastructure layer for Snowpark Container Services, powering more efficient, performant, and cost-effective compute for Snowflake customers.
  • SAS. Use SAS® Viya® for model development and decision training with data in Snowflake by publishing the champion model to Snowpark Container Services through SAS Container Runtime—an OCI compliant container that enables rapid prototyping and easy deployment. 

There are many more integrated partner solutions with Snowpark Container Services including purpose-built databases (RelationalAI, Pinecone, CARTO), ML and MLOps (H2O.ai, Kumo AI, Weights & Biases), and entire applications (Amplitude). 

The Future of Snowpark Container Services 

Snowpark Container Services sets the stage for a new era of app development with no limits to what you can bring to Snowflake. But there is much more we have planned for this new Snowpark runtime, such as delivering on the vision of bringing Snowflake Native Apps with container images via Snowflake Marketplace.

To stay in the loop about Snowpark Container Services, be sure to sign up here.
