To accelerate the delivery of generative AI applications, Snowflake Cortex AI provides LLMs and chat-with-your-data services that are easily and securely accessible via serverless functions.
Building those AI applications may at times require custom code development and deployment. With Snowpark's flexible runtimes for Python and a fully managed container service, it is easy to build end-to-end AI applications without complex integrations with separate infrastructure.
Join us at Snowpark Day to:
- Learn how to use Snowpark Container Services to deploy any OSS LLM, vector store, or custom app from Snowflake Marketplace.
- Watch a demo of Snowpark Container Services to run a generative AI inpainting model within Snowflake.
You'll also hear directly from our Snowflake Native Apps partners on how they used Snowpark Container Services to:
- Deploy an entire geospatial analytics platform through Native Apps + Containers and build maps with CARTO within your infrastructure.
- Use predictive AI to build highly accurate ML models at scale with Kumo's Snowflake Native App.
In partnership with:
Speakers
Julian Forero
Senior Manager, Product Marketing
Snowflake
Emily Dillon
Senior Manager, Product Marketing
Snowflake
Dash Desai
Lead Developer Advocate
Snowflake
Jamie Sanchez
Technical Lead
CARTO
Alan Krumholz
Senior Engineer for AI/ML
Kumo
Disha Dubey
Senior Engineer for AI/ML
Kumo