Data engineers often face complex integration needs but are constrained by rigid pipelines and limited resources, which slows the pace of innovation, especially when delivering new AI initiatives. Now, with Snowflake Openflow, data engineers can build open, extensible, and secure data integrations that enable real-time, scalable, bi-directional data movement. Powered by Apache NiFi, Openflow gives customers the option to use self-contained deployments and runtimes in Snowpark Container Services, as well as the ability to run bring-your-own-cloud (BYOC) deployments in their own cloud VPC. In this technical session, we will dive into the platform architecture.
Learn how to bring all your data together with Snowflake Openflow and build with limitless interoperability and deployment flexibility, whether your data is structured, unstructured, batch, or streaming. Make your integration pipelines ready for AI and the future.
Speakers

Sam Lachterman
Senior Product Manager
Snowflake

Shiyi Gu
Product Marketing Lead
Snowflake