Breaking the Streaming and Batch Silos

No more separation of streaming and batch pipelines. Unified ingestion and transformation in a single system.

Simplify Data Pipelines in One System

Unify stream and batch processing pipelines in just one architecture. Stream and process data at low latency exactly where your historical data is. Processing streaming data is as easy as writing a CTAS (CREATE TABLE AS SELECT) SQL query: with Snowflake Dynamic Tables, customers can use simple, ubiquitous SQL to create low-latency, incremental pipelines for an array of streaming use cases.
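As a sketch of what that looks like (table, column and warehouse names here are hypothetical), a Dynamic Table is defined exactly like a CTAS statement, plus a freshness target:

```sql
-- Hypothetical example: continuously maintain an aggregate over a raw orders table.
CREATE OR REPLACE DYNAMIC TABLE daily_revenue
  TARGET_LAG = '1 minute'      -- how fresh the results should be
  WAREHOUSE  = transform_wh    -- compute used for refreshes
AS
SELECT order_date, SUM(amount) AS revenue
FROM raw_orders
GROUP BY order_date;
```

Snowflake then keeps daily_revenue up to date against changes in raw_orders, with no separate streaming framework to operate.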

Optimize Cost Without Wasted Compute

Streaming ingest for row sets is as much as 50% cheaper than file ingestion at the same volume. Dynamic Tables help you avoid wasted compute by providing guidance on whether incremental or full refresh yields more efficient transformations.

Take Advantage of the AI Data Cloud

Since streaming capabilities are deeply integrated with the AI Data Cloud, you can still enjoy the security and governance capabilities you’ve come to rely on through Snowflake Horizon.

Stream Data with <10-Second Latency

Snowflake helps you handle both streaming and batch data ingestion and transformation in a single system. 

Stream row-set data in near real time with single-digit latency using Snowpipe Streaming, or auto-ingest files with Snowpipe. Both options are serverless so you can scale more easily and manage costs more effectively.
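For the file-based path, a minimal Snowpipe sketch looks like the following, assuming a stage named @landing_stage and a target table raw_events already exist (both names are hypothetical):

```sql
-- Hypothetical names: auto-ingest files as they arrive on the stage.
CREATE OR REPLACE PIPE ingest_events_pipe
  AUTO_INGEST = TRUE
AS
COPY INTO raw_events
FROM @landing_stage
FILE_FORMAT = (TYPE = 'JSON');
```

With AUTO_INGEST enabled, cloud storage event notifications trigger the serverless load; there is no warehouse to size or schedule.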

Adjust Latency with a Single Parameter Change

With Dynamic Tables, you can use SQL or Python to declaratively define data transformations. Snowflake will manage the dependencies and automatically materialize results based on your freshness targets. Dynamic Tables only operate on data that has changed since the last refresh, making high data volumes and complex pipelines simpler and more cost-efficient.

Easily adapt to evolving business needs by making a batch pipeline into a streaming pipeline — with a single latency parameter change.
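Concretely, moving the same pipeline between batch-like and near-real-time freshness is one parameter change (table name hypothetical):

```sql
-- Batch-like: refresh roughly every 8 hours.
ALTER DYNAMIC TABLE daily_revenue SET TARGET_LAG = '8 hours';

-- Streaming-like: keep results within about a minute of the sources.
ALTER DYNAMIC TABLE daily_revenue SET TARGET_LAG = '1 minute';
```

The query definition, dependencies and downstream consumers are untouched; only the freshness target changes.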

Connect Seamlessly to Upstream Data

Effortlessly stream data using Snowflake’s native integrations with upstream sources. Built on top of the Snowpipe and Snowpipe Streaming frameworks, Snowflake provides versatile options to meet your streaming needs, including Snowpipe for Apache Kafka1 and Snowflake’s native connectors for PostgreSQL2 and MySQL.2 Scale data ingestion securely and easily for use cases such as change data capture (CDC), slowly changing dimensions (SCD) type 1 and type 2, and other real-time analytics.
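As one hedged illustration of a CDC-style transformation (all names hypothetical), a Dynamic Table can keep only the latest version of each key — the SCD type 1 pattern — directly in SQL:

```sql
-- Hypothetical CDC feed landed in raw_customer_changes.
CREATE OR REPLACE DYNAMIC TABLE customers_current
  TARGET_LAG = '1 minute'
  WAREHOUSE  = transform_wh
AS
SELECT *
FROM raw_customer_changes
QUALIFY ROW_NUMBER() OVER (
  PARTITION BY customer_id
  ORDER BY change_ts DESC
) = 1;  -- keep the most recent change per customer
```

The QUALIFY clause filters on the window function result, so each customer_id surfaces only its newest row as changes stream in.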

 

Bring Streaming to the Open Lakehouse

Snowflake’s streaming capabilities work with the Apache Iceberg format to help you build an open lakehouse architecture easily with versatile processing options. Load files continuously and automatically into Iceberg Tables with Snowpipe and Snowpipe Streaming. Build low-latency, declarative processing with dynamic Apache Iceberg Tables.3
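A sketch of the Iceberg variant of the same declarative pipeline, assuming an external volume named my_ext_vol has already been configured (all names hypothetical):

```sql
-- Hypothetical names: a dynamic Apache Iceberg table managed by Snowflake.
CREATE OR REPLACE DYNAMIC ICEBERG TABLE daily_revenue_iceberg
  TARGET_LAG      = '5 minutes'
  WAREHOUSE       = transform_wh
  EXTERNAL_VOLUME = 'my_ext_vol'
  CATALOG         = 'SNOWFLAKE'
  BASE_LOCATION   = 'daily_revenue_iceberg/'
AS
SELECT order_date, SUM(amount) AS revenue
FROM raw_orders
GROUP BY order_date;
```

The refreshed results land in open Iceberg format on your own storage, so other engines can read them directly.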

Transform data in Python, Java or Scala with Snowpark.

Travelpass

50%: Faster data refresh operations with 18x more data

300x: Faster data refresh in Tableau dashboards

San Francisco

65%: Cost savings by switching from their previous platform, Databricks, to Snowflake

350%: Improved efficiency to deliver data to business units, thanks to Snowflake Dynamic Tables


Start Your 30-Day Free Trial

Try Snowflake free for 30 days and experience the AI Data Cloud that helps eliminate the complexity, cost and constraints inherent in other solutions.

1. Private preview
2. Public preview
3. Iceberg support in public preview