Explore how Snowflake’s latest product features support easy, connected and trusted end-to-end AI workflows, from data source to application. Hear from partners about how interoperable AI lets organisations incorporate best-in-class functionality by streamlining data sharing without compromising on security and governance.
AI in Action
How to Accelerate to Production

In the race to AI success, organisations require integrated data + AI capabilities that allow them to rapidly build and scale the next generation of AI-powered applications. Discover how Snowflake’s unified and interoperable platform accelerates time to value by reducing manual configurations, encouraging data sharing, and supporting end-to-end governance, privacy and security for trusted, adopted AI workflows.
Join us to learn about the latest generative AI and ML innovations, see the technology in action, and hear real-life stories about how Snowflake helped customers drive productivity in their day-to-day use cases. Snowflake experts and key partners will share demos, best practices and how to:
- Accelerate unstructured data processing from ingestion to downstream analytics
- Build AI data agents to make all your data accessible to agents and applications
- Streamline end-to-end machine learning workflows and develop models faster and cheaper with a container-based runtime (see the sketch after this list)
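
To make the last point concrete, here is a minimal sketch of training a model with an open-source library and logging it to Snowflake’s Model Registry, the kind of workflow the container-based runtime is built to run. It assumes the snowflake-ml-python package, a Snowflake Notebook session, and hypothetical table, column and model names.

```python
# Minimal sketch: train a model on Snowflake data and log it to the Model
# Registry. Assumes snowflake-ml-python and xgboost are installed (as in a
# Container Runtime notebook); table, column and model names are hypothetical.
from snowflake.snowpark.context import get_active_session
from snowflake.ml.registry import Registry
from xgboost import XGBClassifier

session = get_active_session()  # active session inside a Snowflake Notebook

# Pull a hypothetical feature table into pandas for open-source training
df = session.table("ML_DEMO.FEATURES.TRANSACTIONS").to_pandas()
X, y = df.drop(columns=["IS_FRAUD"]), df["IS_FRAUD"]

model = XGBClassifier(n_estimators=200, max_depth=6)
model.fit(X, y)

# Log the trained model so it can be governed and served from Snowflake
reg = Registry(session=session, database_name="ML_DEMO", schema_name="MODELS")
reg.log_model(
    model,
    model_name="fraud_classifier",
    version_name="v1",
    sample_input_data=X.head(10),  # used to infer the model signature
)
```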
Agenda at a glance
Unlock the Full Potential of Gen AI with Hands-On Prompt Engineering
Learn how to build and refine prompts from scratch and how to harness LLM-powered features like Cortex Analyst and Document AI. You will see real-time development, practical use cases and best practices for getting the most out of Snowflake’s AI capabilities, whether for extracting insights, automating workflows or enhancing data pipelines with AI-driven intelligence.
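
As a rough illustration of that kind of prompt-refinement loop, the sketch below runs two prompt variants through Snowflake’s Cortex COMPLETE function and compares the outputs. The model choice, prompts and connection setup are assumptions; Cortex Analyst and Document AI expose their own interfaces and are not shown here.

```python
# Minimal sketch of iterative prompt refinement against Cortex COMPLETE.
# Model name, prompts and connection configuration are illustrative.
from snowflake.snowpark import Session
from snowflake.cortex import Complete

session = Session.builder.getOrCreate()  # assumes a default connection is configured

ticket = "My card was declined twice but I was still charged. Please help."

prompts = {
    "v1_bare": f"Summarize this support ticket:\n{ticket}",
    "v2_structured": (
        "You are a support triage assistant. Return JSON with keys "
        '"category", "sentiment" and "one_line_summary".\n'
        f"Ticket:\n{ticket}"
    ),
}

# Run each prompt variant through the same model and compare the responses
for name, prompt in prompts.items():
    response = Complete("mistral-large", prompt, session=session)
    print(f"--- {name} ---\n{response}\n")
```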
Using LLM-as-a-Judge: Observability and Evaluation for Gen AI
Discover how to instrument an AI application with TruLens, an open source observability framework, to trace each component of your application and use LLM-as-a-Judge evaluations to identify and improve points of failure. You’ll get best practices for enabling observability in your AI workflows, common feedback functions used for evaluation, and guidance on using observability frameworks to compare application versions and determine which is best fit for production use.
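
For a feel of what that looks like in code, here is a minimal sketch of instrumenting a toy app with TruLens and attaching one LLM-as-a-Judge feedback function. Import paths follow the trulens_eval 0.x layout and may differ in newer TruLens releases; the app, judge provider and app_id are illustrative.

```python
# Minimal sketch: trace a custom app with TruLens and score it with an
# LLM-as-a-Judge feedback function. Assumes trulens_eval 0.x and an
# OPENAI_API_KEY for the judge provider; the app itself is a placeholder.
from trulens_eval import Tru, Feedback, TruCustomApp
from trulens_eval.tru_custom_app import instrument
from trulens_eval.feedback.provider import OpenAI

class SupportBot:
    @instrument  # traces this method as a component of the app
    def answer(self, question: str) -> str:
        # Stand-in for a real retrieval + generation pipeline
        return "You can reset your password from the account settings page."

tru = Tru()
judge = OpenAI()  # LLM-as-a-Judge provider

# Score how relevant each answer is to the question that produced it
f_relevance = Feedback(judge.relevance, name="Answer Relevance").on_input_output()

bot = SupportBot()
tru_bot = TruCustomApp(bot, app_id="support_bot_v1", feedbacks=[f_relevance])

with tru_bot as recording:
    bot.answer("How do I reset my password?")

# Compare app versions side by side to pick the one fit for production
print(tru.get_leaderboard(app_ids=["support_bot_v1"]))
```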
How Coinbase Developed an End-to-End ML Platform
Learn how Coinbase builds end-to-end ML workflows on top of Snowflake’s platform for optimal data security, governance and price performance. Coinbase now automates batch and online inference on predictive ML models to quickly and accurately unban users who were initially incorrectly flagged as suspected fraud or bots, resulting in an improved user experience and increased revenue.
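
For flavour, here is a minimal sketch of what automated batch inference against a registered model can look like on Snowflake, reusing the hypothetical fraud_classifier from the earlier sketch. The object names are illustrative and this is not Coinbase’s actual pipeline.

```python
# Minimal sketch of batch inference with a model from the Model Registry.
# Names are hypothetical; not Coinbase's actual implementation.
from snowflake.snowpark import Session
from snowflake.ml.registry import Registry

session = Session.builder.getOrCreate()  # assumes a default connection is configured

reg = Registry(session=session, database_name="ML_DEMO", schema_name="MODELS")
model_version = reg.get_model("fraud_classifier").version("v1")

# Score the latest batch of flagged accounts inside Snowflake
flagged = session.table("ML_DEMO.FEATURES.FLAGGED_ACCOUNTS")
scored = model_version.run(flagged, function_name="predict")

# Persist scores so a downstream job can automatically clear false positives
scored.write.save_as_table("ML_DEMO.RESULTS.FLAGGED_SCORES", mode="overwrite")
```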