
Snowflake Announces State-of-the-Art AI to Talk to your Data, Securely Customize LLMs and Streamline Model Operations

Generative AI presents enterprises with the opportunity to extract insights at scale from unstructured data sources, such as documents, customer reviews and images. It also presents an opportunity to reimagine every customer and employee interaction with data through conversational applications. These opportunities come with challenges for data and AI teams, who must prioritize data security and privacy while rapidly deploying new use cases across the organization.

Meanwhile, machine learning (ML) remains valuable in established areas of predictive AI, like recommendation systems, demand forecasting and fraud prevention. But because the infrastructure requirements of gen AI differ from those of ML, data is often duplicated across systems, even within the same team, leading to ungoverned data pipelines that add operational costs and heighten data risks.

Snowflake is committed to helping enterprises continue to unlock new levels of productivity and insights in a way that is efficient, easy and trusted, whether the use case relies on generative AI, machine learning or both:

  • Easy: Enable more than just a few technical experts to use AI with fully managed services and infrastructure that just work and are accessible via code (SQL, Python, REST) and no-code interfaces
  • Efficient: Streamline the development-to-deployment lifecycle with top-quality models and services that run next to the enterprise data
  • Trusted: Extend the governance and granular role-based access controls trusted by thousands of organizations from the data to the models, without added complexity

At Zoom Communications, our mission is to be one platform delivering limitless human connections. To accomplish this, we want to empower every team member to safely use AI to better serve our customers. Using Snowflake’s unified, easy-to-use and secure platform for generative AI and machine learning, we continue to democratize AI to efficiently turn data into better customer experiences.

Awinash Sinha, Corporate CIO, Zoom Communications

With this approach, Snowflake continues to enable customers to bring AI to every part of their business and do things like:

  • Talk to your data using state-of-the-art AI: Two new chat APIs make it possible to ask questions against unstructured data, such as documents, and structured data, such as sales transactions, without complex infrastructure or custom LLM orchestration. Cortex Analyst (public preview soon), built using Meta Llama 3 and Mistral Large models, lets developers surface insights for business users with a service that turns text questions into answers from analytical tables in Snowflake, while Cortex Search (public preview soon) answers questions against documents and other unstructured text. 
  • Empower more teams to safely use AI: In addition to the easy-to-use SQL and Python functions that data teams can run from inside Snowflake Notebooks (public preview), Snowflake is reaching more users via Snowflake AI & ML Studio (public preview for ML models, private preview for LLMs) for no-code AI development. To let developers quickly interact with LLMs in Snowflake Cortex AI from external apps, REST API support is coming, in addition to the existing SQL and Python functions (a minimal Python sketch follows this list). Business teams can use a simple natural language interface in Document AI (generally available soon) to extract content and analytical values from PDFs and other documents without needing AI expertise. 
  • Create use-case-specific generative AI with ease: To help organizations enhance LLM performance and deliver more personalized experiences without operational complexity, serverless fine-tuning of foundation models is now in public preview. Fine-tuning can be executed either through the no-code experience in Snowflake AI & ML Studio or through the code-based API.  
  • Expedite and scale feature and model operations: Developing, deploying and managing features and models at scale is getting easier with Notebooks, which can be scheduled for automated execution (public preview); Snowflake Feature Store (public preview), which streamlines the use and management of features; and the Snowflake Model Registry, which manages models trained in Snowflake or any other platform and provides a unified experience for all custom models, including LLMs fine-tuned using Snowflake Cortex AI.
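
To make the list above concrete, here is a minimal sketch of calling a Cortex AI LLM from Python with Snowpark. The prompt and model choice are placeholders, and the same capability is exposed in SQL as SNOWFLAKE.CORTEX.COMPLETE.

```python
# Minimal sketch: calling a Cortex AI LLM from Python (snowflake-ml-python).
# Assumes it runs inside a Snowflake Notebook; from an external app, build a
# Session with Session.builder.configs({...}).create() instead.
from snowflake.snowpark.context import get_active_session
from snowflake.cortex import Complete

session = get_active_session()

# SQL equivalent: SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', <prompt>);
summary = Complete(
    "mistral-large",  # any Cortex-supported model
    "Summarize the top three themes in last quarter's customer reviews.",
    session=session,
)
print(summary)
```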

Talk to your data using state-of-the-art AI

Cortex Search, coming soon to public preview, makes talking to documents and other text-based data sets, such as wikis and FAQs, as easy as running a SQL function. Getting your document search service up and running takes only a few clicks in Studio or a single command that defines which documents should be made available for searching (see Figure 1).
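
As a rough illustration of that single command, the sketch below creates a search service over a hypothetical support_documents table from Snowpark Python. The table, column, warehouse and service names are placeholders, and the preview query function shown at the end is an assumption; check the Cortex Search documentation for the supported query APIs.

```python
# Sketch: creating and previewing a Cortex Search service from Snowpark Python.
# Table, column, warehouse and service names are placeholders.
from snowflake.snowpark.context import get_active_session

session = get_active_session()

# One command defines which documents are searchable and how often to refresh.
session.sql("""
    CREATE OR REPLACE CORTEX SEARCH SERVICE support_docs_search
      ON content                      -- text column to index
      ATTRIBUTES doc_title            -- extra columns to filter on or return
      WAREHOUSE = my_wh
      TARGET_LAG = '1 hour'
      AS (SELECT content, doc_title FROM support_documents)
""").collect()

# Preview a query against the service (this helper is an assumption for
# illustration; the documentation lists the supported query interfaces).
hits = session.sql("""
    SELECT SNOWFLAKE.CORTEX.SEARCH_PREVIEW(
        'support_docs_search',
        '{"query": "how do I rotate my API keys?", "limit": 3}')
""").collect()
print(hits)
```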

To provide more accurate results, Cortex Search uses state-of-the-art retrieval and ranking techniques. Combining semantic and keyword search, built on cutting-edge technology from Neeva and the Arctic embed models developed by the Snowflake AI Research team, Cortex Search provides users with high-quality results without operational complexity. Cortex Search can scale to millions of documents with subsecond latency, using fully managed vector embedding and retrieval. There is no need to set up or manage a separate vector database.

Cortex Analyst, coming soon to public preview, allows app developers to create applications on top of analytical data stored in Snowflake, so business users can get the data insights they need simply by asking questions in natural language. Whereas Snowflake Copilot helps SQL developers accelerate development from Snowsight by turning text into SQL, Cortex Analyst is designed to turn questions into answers, and to do so from any application that business users interact with on a daily basis. To deliver the high levels of accuracy that business users expect in order to trust and act on the results, Snowflake handles the heavy lifting, combining state-of-the-art LLMs from Meta and Mistral AI, and only asks developers to provide a semantic model during setup. This semantic model supplies additional context about the organization's specific terminology and data modeling structure.
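
Because Cortex Analyst is meant to be called from any application, a rough sketch of such a call is shown below. The endpoint path, payload shape, semantic model file and authentication token are assumptions for illustration only; the exact contract is defined in the Cortex Analyst documentation.

```python
# Rough sketch: asking Cortex Analyst a business question over HTTPS.
# Endpoint path, payload fields, stage path and auth token are assumptions.
import requests

ACCOUNT_URL = "https://<account_identifier>.snowflakecomputing.com"
TOKEN = "<oauth_or_keypair_jwt_token>"  # placeholder credential

payload = {
    # The semantic model (a YAML file on a stage) supplies business terms
    # and data-model context so answers match the organization's vocabulary.
    "semantic_model_file": "@analytics.models.semantic_stage/revenue.yaml",
    "messages": [
        {"role": "user",
         "content": [{"type": "text",
                      "text": "What was revenue by region last quarter?"}]},
    ],
}

resp = requests.post(
    f"{ACCOUNT_URL}/api/v2/cortex/analyst/message",
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}",
             "Content-Type": "application/json"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # response carries the generated SQL and/or the answer
```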

Empower more teams to use AI

Snowflake AI & ML Studio, in private preview for LLMs, brings no-code AI development to Snowflake. Studio is accessible within Snowsight and provides interactive interfaces for teams to quickly combine multiple models with their data and compare results, accelerating deployment to production applications. Today, Studio includes interactive experiences such as building machine learning models, like forecasts; in the future, it will also include an interface to compare and evaluate responses from multiple LLMs for a single prompt, execute LLM fine-tuning and more. Especially for generative AI use cases, this no-code experience makes it easier for teams to evaluate and select the state-of-the-art model that best fits their task and cost goals.

Snowflake Notebooks help accelerate ML workflows. The seamless integration of experiment tracking with Weights & Biases directly within notebooks eliminates context switching and streamlines the entire machine learning lifecycle for building and deploying models. We're excited to see how this integration unlocks further efficiency gains for our customers.

Venky Yerneni, Manager, Solution Architecture, Weights & Biases

Snowflake Notebooks are now available in public preview as a way to empower data teams, proficient in SQL, Python or both, to run interactive analytics, train models or evaluate LLMs in an integrated cell-based environment. This interactive development experience eliminates the processing limits of local development as well as the security and operational risks of moving data to a separate tool. And since Notebooks come integrated with Streamlit libraries, it is easy to take code developed in a Notebook and deploy it in Streamlit in Snowflake to share insights as interactive applications. 
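
As a small, hypothetical example of that workflow, the Notebook cell below aggregates a Snowpark DataFrame and renders the result with the bundled Streamlit library. The SALES table and its columns are placeholders.

```python
# Sketch of a Snowflake Notebook cell: query with Snowpark, visualize with Streamlit.
# The SALES table and its REGION/AMOUNT columns are hypothetical.
import streamlit as st
from snowflake.snowpark.context import get_active_session
from snowflake.snowpark.functions import col, sum as sum_

session = get_active_session()  # Notebooks provide an authenticated session

revenue = (
    session.table("SALES")
    .group_by("REGION")
    .agg(sum_(col("AMOUNT")).alias("TOTAL_REVENUE"))
)

st.bar_chart(revenue.to_pandas(), x="REGION", y="TOTAL_REVENUE")
```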

Learn more about how to use Notebooks from our documentation or this quickstart.

Document AI, generally available soon, provides a new framework to easily extract content such as invoice amounts or contract terms from documents using Arctic TILT, a state-of-the-art, built-in multimodal LLM. Nontechnical business users can use the natural language interface to define the set of fields or values to extract and, if necessary, fine-tune the model to better understand specific document formats. To continuously extract the desired fields as new documents come in, the model can then be executed via the PREDICT SQL function by the data or ML engineers who operationalize it into pipelines that feed downstream analytics or ML models. 
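
To illustrate that operational step, the sketch below runs an already published Document AI model build over files in a stage. The database, schema, model-build name, stage and version number are placeholders, and the exact call pattern is described in the Document AI documentation.

```python
# Sketch: scoring new documents with a published Document AI model build.
# Database, schema, model-build name, stage and version number are placeholders;
# the stage is assumed to have a directory table enabled.
from snowflake.snowpark.context import get_active_session

session = get_active_session()

results = session.sql("""
    SELECT
        relative_path,
        doc_ai_db.doc_ai_schema.invoice_extractor!PREDICT(
            GET_PRESIGNED_URL(@doc_ai_db.doc_ai_schema.invoice_stage, relative_path),
            1  -- model build version
        ) AS extracted_fields
    FROM DIRECTORY(@doc_ai_db.doc_ai_schema.invoice_stage)
""").collect()

for row in results:
    print(row["RELATIVE_PATH"], row["EXTRACTED_FIELDS"])
```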

Making custom generative AI secure and easy

Snowflake Cortex fine-tuning, now in public preview, supports secure and serverless customization of a subset of Meta and Mistral AI models. Once the data is ready, kicking off the fine-tuning process is as easy as running a SQL function or a few clicks in Snowflake AI & ML Studio, without having to manage any infrastructure. Fine-tuned models are available only to you by default and can be shared with others using policies defined through the Snowflake Model Registry. Users with access to the custom models can use them just as easily as any other Cortex-supported LLM via the COMPLETE function in Cortex AI. 
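
A minimal sketch of the code-based path is shown below, assuming a training table of prompt/completion pairs. The table names, base model and resulting model name are placeholders; consult the fine-tuning documentation for the supported base models and exact arguments.

```python
# Sketch: kicking off serverless fine-tuning with the Cortex FINETUNE function.
# Training/validation tables, base model and tuned-model name are placeholders.
from snowflake.snowpark.context import get_active_session

session = get_active_session()

job = session.sql("""
    SELECT SNOWFLAKE.CORTEX.FINETUNE(
        'CREATE',
        'support_assistant',                            -- name of the tuned model
        'mistral-7b',                                   -- base model to customize
        'SELECT prompt, completion FROM tuning_train',  -- training data
        'SELECT prompt, completion FROM tuning_val'     -- optional validation data
    )
""").collect()
print(job)  # returns a job ID; progress can be checked with FINETUNE('DESCRIBE', <id>)

# Once the job finishes, the tuned model is used like any other Cortex LLM.
answer = session.sql("""
    SELECT SNOWFLAKE.CORTEX.COMPLETE('support_assistant',
        'How do I merge two customer accounts?')
""").collect()
print(answer)
```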

For details on pricing and supported models, check out our documentation.

Cortex Guard, generally available soon, lets users filter harmful content associated with violence, hate, self-harm and criminal activities. Safety controls can be applied effortlessly to any LLM in Cortex AI by using the guardrails setting that is now part of the COMPLETE function. With Cortex Guard, enterprises can quickly implement the safety controls necessary to deliver gen AI in production applications. 
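
As a sketch of that setting, the example below passes guardrails as an option to COMPLETE. The model and prompt are placeholders, and the exact option name and options-object form should be confirmed against the Cortex Guard documentation.

```python
# Sketch: enabling Cortex Guard via the guardrails option of COMPLETE.
# Model choice and prompt are placeholders; the options object follows the
# COMPLETE(model, messages, options) form described in the documentation.
from snowflake.snowpark.context import get_active_session

session = get_active_session()

filtered = session.sql("""
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',
        [{'role': 'user', 'content': 'Draft a reply to this customer complaint...'}],
        {'guardrails': TRUE}   -- filter harmful responses before they are returned
    )
""").collect()
print(filtered)
```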

Expedite and scale feature and model operations

Snowflake Horizon ML Lineage, in private preview, helps teams trace end-to-end lineage of features, data sets and models from data to insight for seamless reproducibility and simplified observability. To quickly map the relationship between data and models, a visualization interface will be made available inside Snowsight.

Feature Store, in public preview, is an integrated and centralized solution to define, manage, store and discover features. This solution, part of the broader suite of Snowflake ML components, enables consistency and accuracy across different ML pipelines and teams, whether the features are used for training models or running inference. As part of this public preview, features can be defined and managed using the Snowpark ML API from Snowflake Notebooks or any IDE. The Feature Store automatically refreshes data incrementally from batch and streaming sources according to a user-defined schedule, providing teams with the most up-to-date information.
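
A minimal sketch of that API is below, assuming a hypothetical transactions table. The database, warehouse, entity and feature names are placeholders.

```python
# Sketch: defining and registering a feature view with the Snowpark ML Feature Store.
# Database, warehouse, table, entity and feature names are placeholders.
from snowflake.snowpark.context import get_active_session
from snowflake.ml.feature_store import (
    FeatureStore, Entity, FeatureView, CreationMode,
)

session = get_active_session()

fs = FeatureStore(
    session=session,
    database="ML_DB",
    name="CUSTOMER_FS",
    default_warehouse="ML_WH",
    creation_mode=CreationMode.CREATE_IF_NOT_EXIST,
)

customer = Entity(name="CUSTOMER", join_keys=["CUSTOMER_ID"])
fs.register_entity(customer)

# Feature logic is just a Snowpark DataFrame; refresh_freq keeps it up to date.
spend = session.sql("""
    SELECT customer_id, SUM(amount) AS total_spend_30d
    FROM transactions
    GROUP BY customer_id
""")

fv = FeatureView(
    name="CUSTOMER_SPEND",
    entities=[customer],
    feature_df=spend,
    refresh_freq="1 day",
)
fs.register_feature_view(feature_view=fv, version="1")
```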

Learn more about how to get started from our documentation.

Using Snowflake ML has enabled us to hit a big milestone in our data and AI vision and efficiently deliver true, one-to-one, personalized experiences for our customers. We've been able to achieve a 70% cost reduction and enhanced agility by moving from running hour-long inference jobs to predictions in near real-time.

Stefan Kochi, CTO, Paytronix

Model Registry is generally available and makes it easy to govern all ML models, whether you trained them in Snowflake or in another ML system. By bringing models to Snowflake, teams get a centralized hub to manage models and their related metadata, as well as more efficient inference by running models where the production features are stored and managed. 
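
For illustration, the sketch below logs a model trained elsewhere and runs inference next to the data. The model object, DataFrames and database names are placeholders standing in for your own artifacts.

```python
# Sketch: registering an externally trained model and running inference in Snowflake.
# churn_model (e.g., a fitted scikit-learn or XGBoost estimator), the feature
# DataFrames and the database/schema names are placeholders.
from snowflake.snowpark.context import get_active_session
from snowflake.ml.registry import Registry

session = get_active_session()
reg = Registry(session=session, database_name="ML_DB", schema_name="MODELS")

model_version = reg.log_model(
    churn_model,
    model_name="CHURN_MODEL",
    version_name="V1",
    sample_input_data=train_features_df.limit(100),  # used to infer the model signature
)

# Inference runs where the governed production features already live.
predictions = model_version.run(scoring_features_df, function_name="predict")
predictions.show()
```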

To see how easy it is to train and govern ML models in Snowflake, check out this quickstart.

Learn More and Resources

By having a unified platform with granular access controls for data and models, teams can accelerate the pace at which they bring AI into every part of their business in a way that is efficient, easy and trusted. 

Try it for yourself with our quickstarts for generative AI and ML.

Embrace Generative AI and LLMs with the Snowflake Data Cloud
