To accelerate delivery of applications using generative AI, Snowflake Cortex AI provides LLMs and chat-with-your-data services that are easily and securely accessible via serverless functions.
To build those AI applications, there may be times when custom code development and deployment are needed. Snowpark offers flexible runtimes for Python, and with a fully managed container service, it is easy to build end-to-end AI applications without complex integrations with separate infrastructure. Join us at Snowpark Day to learn how to use:
- Snowpark Python to preprocess and chunk documents for Retrieval Augmented Generation architectures
- Snowpark External Access to securely connect with third-party LLMs and other AI services
- Snowpark Container Services to deploy any OSS LLM, vector store, or custom app from Snowflake Marketplace
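To give a flavor of the first item above, here is a minimal sketch of the kind of document-chunking logic you might register as a Snowpark Python UDF in a RAG pipeline. The function and parameter names (chunk_text, chunk_size, overlap) are illustrative, not part of any Snowflake API; production pipelines would typically chunk by tokens or sentences rather than raw words.

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping word-based chunks for retrieval.

    Overlap between consecutive chunks helps preserve context that
    would otherwise be cut at a chunk boundary.
    """
    words = text.split()
    step = chunk_size - overlap  # how far the window advances each iteration
    chunks = []
    for start in range(0, len(words), step):
        chunk = words[start:start + chunk_size]
        if chunk:
            chunks.append(" ".join(chunk))
        if start + chunk_size >= len(words):
            break  # the last window already covered the tail of the document
    return chunks
```

Each chunk would then be embedded and stored in a vector store for retrieval at query time.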
You will also hear directly from our Snowflake Native Apps partners on how they used Snowpark Container Services to:
- Deploy an entire geospatial analytics platform through Native Apps + Containers and build maps with CARTO within your infrastructure
- Use predictive AI to build highly accurate ML models at scale using Kumo’s Snowflake Native App
Speakers
Julian Forero
Senior Manager, Product Marketing
Snowflake
Emily Dillon
Senior Manager, Product Marketing
Snowflake
Muzz Imam
Senior Product Manager
Snowflake
Jamie Sanchez
Technical Lead
CARTO
Alan Krumholz
Senior Engineer for AI/ML
Kumo
Disha Dubey
Senior Engineer for AI/ML
Kumo