This session will discuss how to securely and effortlessly deploy custom LLMs in Snowflake using Snowpark Container Services. The demo will address the challenges of managing extensive client information in sales and showcase a Streamlit-based LLM AI assistant that gives salespeople a single place to check customer status.
Join us for this live demo to learn how to:
- Create a simple Q&A that generates account overviews from a mix of structured and unstructured data using tagging and sentiment analysis
- Retrieve context-aware account details through retrieval-augmented generation (RAG)
- Serve large LLMs in Snowpark Container Services with multi-node, multi-GPU compute on Ray clusters (a minimal sketch follows this list)
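To make the last point concrete, here is a minimal, hypothetical sketch of serving an LLM with Ray Serve on GPU workers. The model name, replica counts, and the Hugging Face pipeline are illustrative assumptions, not the exact setup shown in the demo; in this scenario, Snowpark Container Services would host the nodes that make up the Ray cluster.

```python
# Hypothetical sketch: serving an LLM behind Ray Serve with GPU-pinned replicas.
from ray import serve
from starlette.requests import Request
from transformers import pipeline


@serve.deployment(
    num_replicas=2,                     # one replica per GPU worker (assumption)
    ray_actor_options={"num_gpus": 1},  # pin each replica to a single GPU
)
class LLMService:
    def __init__(self):
        # Placeholder model; the session would use a custom LLM instead.
        self.generator = pipeline("text-generation", model="gpt2", device=0)

    async def __call__(self, request: Request) -> dict:
        body = await request.json()
        prompt = body.get("prompt", "")
        output = self.generator(prompt, max_new_tokens=128)[0]["generated_text"]
        return {"completion": output}


# Deploy on an existing Ray cluster; Serve exposes an HTTP endpoint
# (port 8000 by default) that a Streamlit front end could call.
app = LLMService.bind()
serve.run(app)
```

With the deployment running, a client can POST a JSON body such as `{"prompt": "Summarize account Acme Corp"}` to the Serve endpoint and receive the generated completion, which is the pattern a Streamlit assistant would use to fetch account overviews.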
Speakers
Muzz Imam
Senior Product Manager
Stella Yang
Principal Data Scientist