
Do More for Less: Announcing New Snowpipe Pricing — and 9 Other Ways to Save on Data Engineering Costs

In keeping with the spirit of the holidays — and the season of savings — Snowflake is pleased to announce that the pricing model change to Snowpipe, which went into effect for Business Critical and VPS editions in August, has now been extended to customers on Enterprise and Standard editions. 

Passing along Snowpipe savings is just part of Snowflake’s deep commitment to delivering the best possible value to customers, redefining the economics they need at enterprise scale. As costs around data and AI continue to grow, here we bring you 10 ways Snowflake can help keep data engineering costs in check. 

Happy savings!

1. Get better cost predictability with the new (and more cost-effective) Snowpipe pricing model

As of Dec. 8, 2025, all Snowflake customers will be charged a consistent 0.0037 credits per GB across all Snowpipe services, including file ingestion¹ and streaming². This replaces the previous pricing model, which was based on the amount of compute resources used and the number of files ingested, making it far easier for users to predict spending; the lower rate should also deliver dramatic savings on data ingestion for most workloads. Based on internal benchmarks from the August rollout to Business Critical and VPS edition customers, customers are already realizing upward of 50% cost savings under the new pricing model. Best of all, no action is needed on the customer side: the savings apply automatically to both existing and new Snowpipe usage. 
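To see how a flat per-GB rate translates into a monthly bill, here is a minimal back-of-the-envelope sketch in Python. It assumes only the 0.0037 credits/GB figure above; the dollar price per credit is a hypothetical placeholder, since your actual credit price depends on edition, region and contract.

```python
# Back-of-the-envelope Snowpipe cost estimate under the flat per-GB model.
# CREDITS_PER_GB comes from the announcement above; PRICE_PER_CREDIT is a
# hypothetical placeholder -- use the rate from your own Snowflake contract.

CREDITS_PER_GB = 0.0037
PRICE_PER_CREDIT = 3.00  # hypothetical $/credit; varies by edition and region

def monthly_snowpipe_cost(gb_per_day: float, days: int = 30) -> tuple[float, float]:
    """Return (credits, dollars) for a month of ingestion at gb_per_day."""
    credits = gb_per_day * days * CREDITS_PER_GB
    return credits, credits * PRICE_PER_CREDIT

if __name__ == "__main__":
    for gb_per_day in (50, 500, 5_000):
        credits, dollars = monthly_snowpipe_cost(gb_per_day)
        print(f"{gb_per_day:>6} GB/day -> {credits:,.1f} credits (~${dollars:,.2f}/month)")
```

Because the only input is data volume, the estimate holds regardless of file counts or how compute is scheduled, which is exactly what makes spending under the new model predictable.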

2. Achieve enterprise-grade streaming on a budget with Snowpipe Streaming

For lower-latency streaming pipeline needs, we are also enhancing Snowflake’s streaming capabilities with the new Snowpipe Streaming architecture, which offers a fully serverless, usage-based pricing model that lowers costs for high-throughput, real-time workloads. 

Motorq, a connected vehicle intelligence company, is transitioning from traditional batch jobs to more cost-effective, lower-latency pipelines with Snowpipe Streaming. With this change, the company has reduced data latency from hours to seconds and is cutting ingestion spend by 60%. Furthermore, one customer is reporting up to 30% lower client-side resource costs using our new Rust-based SDK.

3. Onboard new data with confidence: Simpler pricing for data integration with Snowflake Openflow

Predictability is key when managing data pipelines. With Openflow, we have introduced a simple pricing model for the pipeline runtime that uses a fixed credit amount per vCPU. This transparency allows teams to budget more effectively without sacrificing performance. The impact on operational efficiency is tangible; for instance, Dynata has reduced its time to insight by over 75% and cut the cost of onboarding new data sets by over 50%.
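As a rough illustration of how a fixed credit-per-vCPU model simplifies budgeting, the sketch below computes a monthly estimate from runtime size and hours. The CREDITS_PER_VCPU_HOUR value is a hypothetical placeholder, not Openflow’s published rate; substitute the figure from Snowflake’s pricing documentation.

```python
# Illustrative budgeting sketch for a fixed credit-per-vCPU runtime model.
# CREDITS_PER_VCPU_HOUR is a hypothetical placeholder, not an official rate.

CREDITS_PER_VCPU_HOUR = 0.1  # placeholder; check Snowflake's pricing docs

def runtime_credits(vcpus: int, hours_per_day: float, days: int = 30) -> float:
    """Estimate monthly credits for a pipeline runtime of a given size."""
    return vcpus * hours_per_day * days * CREDITS_PER_VCPU_HOUR

# Example: a 4-vCPU runtime on a scheduled window vs. running around the clock.
print(runtime_credits(vcpus=4, hours_per_day=8))   # scheduled window
print(runtime_credits(vcpus=4, hours_per_day=24))  # always-on
```

The point of the exercise is that cost scales linearly with two knobs you already control (vCPUs and runtime hours), so the budget conversation stays simple.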

4. Unify architecture and slash costs with Snowpark Connect for Apache Spark™

For managed Spark environments, direct costs such as compute and storage often make up 30%-40% of the total cost, while a staggering 60%-70% of TCO is frequently "trapped" in indirect costs, such as cluster management. But now, with Snowpark Connect for Apache Spark™, you can execute Spark code directly on the powerful Snowflake engine, drastically reducing costly data movement and egress fees. This unifies your architecture and removes the operational burden of provisioning and patching clusters.
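For a sense of what running existing Spark code against a remote engine looks like in practice, here is a minimal PySpark sketch using a Spark Connect-style remote session. The endpoint URL and table names are illustrative assumptions, not Snowflake’s actual configuration; the exact Snowpark Connect setup comes from Snowflake’s documentation, and the idea is that your DataFrame logic itself needs little or no change.

```python
# Minimal sketch: existing PySpark DataFrame code pointed at a remote engine
# via Spark Connect. The connection string below is a hypothetical placeholder;
# follow Snowflake's Snowpark Connect setup docs for the real configuration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Instead of provisioning and patching a cluster, attach to a remote endpoint.
spark = SparkSession.builder.remote("sc://example-endpoint:15002").getOrCreate()

# Unmodified Spark DataFrame logic runs against the remote engine.
orders = spark.read.table("analytics.public.orders")
daily = (
    orders
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"))
    .orderBy("order_date")
)
daily.show()
```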

The savings here can be significant. Booking.com saw its runtime improve from 1.5 hours to just 25 minutes with Snowpark Connect. And Chicago Trading Company realized $800,000 in annual TCO savings by leveraging underlying Snowpark execution and the Snowflake engine. Snowflake projects customer TCO savings of approximately 30% over a three-year period by utilizing Snowpark Connect instead of non-Snowflake managed Spark.³

5. Simplify your tech stack with dbt Projects on Snowflake

Managing external infrastructure often incurs additional costs. By running dbt projects natively, customers can dramatically simplify their tech stacks, essentially removing the costs associated with hosting and managing dbt core infrastructure on their own, along with external orchestration tools in some cases. This consolidation drives efficiency. Chris Androsoff, Director of Data at the helicopter air ambulance nonprofit STARS, noted that consolidating on one platform improved cost transparency through simplicity, freeing his team’s engineers to focus on delivering value faster.

6. Enhance pipeline efficiency with Dynamic Tables

Pipeline development is faster and pipelines are more reliable with Dynamic Tables. This declarative approach eliminates the need for manual coding of orchestration and dependency logic, as Snowflake automatically handles refresh logic and processes new data efficiently using incremental updates. Most recently, Snowflake has improved the efficiency of chained Dynamic Tables to provide incremental maintenance more effectively. This helps ensure that your pipelines are running as efficiently as possible, processing only what is necessary rather than re-computing entire data sets. 
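As a concrete sketch of the declarative approach, the snippet below declares a dynamic table and lets Snowflake manage refresh logic and dependencies within a target lag. The object names and connection parameters are hypothetical placeholders, and the DDL is issued here through the Snowflake Python connector purely for illustration.

```python
# Minimal sketch of the declarative approach: declare a dynamic table and let
# Snowflake manage refresh logic and dependencies. All object names and
# connection parameters below are hypothetical placeholders.
import snowflake.connector

ddl = """
CREATE OR REPLACE DYNAMIC TABLE daily_revenue
  TARGET_LAG = '15 minutes'   -- how fresh the results should stay
  WAREHOUSE  = transform_wh   -- compute used for incremental refreshes
AS
  SELECT order_date, SUM(amount) AS revenue
  FROM raw_orders
  GROUP BY order_date
"""

conn = snowflake.connector.connect(
    account="my_account",  # placeholder credentials; use your own connection
    user="my_user",
    password="...",
    warehouse="transform_wh",
    database="analytics",
    schema="public",
)
try:
    conn.cursor().execute(ddl)  # Snowflake handles refreshes from here on
finally:
    conn.close()
```

There is no orchestration code to write or schedule: once the table is declared, Snowflake decides when and how to refresh it, processing only new data where it can.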

Travelpass, a travel experience company, has seen 65% cost savings by switching from its previous platform, Databricks (with Delta Live Tables), to Snowflake with Dynamic Tables. Dynamic Tables’ simplicity and flexibility now enable Travelpass to include more people in the data engineering process and deliver data to business units over 350% more efficiently.

7. Automate data management with Storage Lifecycle Policies

Data retention shouldn't require manual intervention. Storage Lifecycle Policies provide a simple, automated way to manage the data lifecycle — from archiving cold data to deleting expired records. This automation helps save costs and support compliance, allowing teams to focus on innovation rather than infrastructure. By reducing operational complexity, customers can see significant savings on storage costs.⁴ Network security company Securonix, for instance, reduced its overall storage costs by 50%, compared to its previous cold storage file format, through optimized tiering and policy-driven automation.

8. Enhance global data movement with the Egress Cost Optimizer (ECO)

Sharing data across regions usually comes with a hefty price tag. Cross-cloud data sharing with the Egress Cost Optimizer (ECO) provides cost-efficient data sharing across any Snowflake region or cloud on AWS, GCP and Azure by intelligently routing data. This capability offers potential savings of up to 96% in egress costs (assuming replication from AWS US West to all Snowflake Commercial and Government regions). Data analytics company RavenPack is a prime example, sending data across global regions while reducing its data sharing costs by 14x.

9. Break costly silos across your enterprise systems 

Snowflake helps you cut the costs of redundancy. Unify critical enterprise data sources, including ERP, CRM and more, into Snowflake’s AI Data Cloud to remove silos and reduce redundant data copies and pipelines. Zero-copy, bidirectional integrations with SAP, Salesforce and Workday help you avoid costly data movement while keeping your data consistently available where it is needed.

10. Build fewer pipelines and eliminate unnecessary ingestion with Secure Data Sharing

To unlock insights and AI readiness, you need a comprehensive data foundation. However, much of your data lives outside of your organization, sitting with customers, suppliers, vendors and external partners. With Secure Data Sharing, if the data you need is already in the Snowflake ecosystem and shared by the provider via Snowflake Marketplace or Direct Shares, the provider pays for the storage, not you, letting you unlock data insights for less. Zero-copy data sharing, which is interoperable, governed and enterprise ready, also spares you the expense of building and maintaining a pipeline to move that data. 

Save with Snowflake

To learn even more about the various business benefits and cost savings you can find with Snowflake, read the Forrester report “The Total Economic Impact of the Snowflake AI Data Cloud” and download “The Simple Guide to Snowflake Pricing” for more details. 

Or, for a more interactive experience, try the brand-new Snowflake Pricing Calculator, an online tool that takes into account cloud provider, compute, storage and AI usage to help you estimate your costs today.


1  For text files (such as CSV, JSON and XML), you will be charged based on their uncompressed size. For binary files (such as Parquet, Avro and ORC), you will be charged based on their observed size regardless of compression.

2  The high-performance architecture introduces a flat-rate pricing model based on the volume of uncompressed data ingested.

3  TCO is a sum of direct (cloud compute, storage, cold start-up time, over provisioning and data transfer) and indirect (engineering time spent on incidents, re-runs, tuning and SLA run-time consistency) costs. The costs are projected based on a combination of customer success stories, a 2023 third-party report, customer PoCs and Snowflake's internal benchmark data.

4  Savings based on one-year-plus retention with periodic retrieval (10% data retrieved every six months) for enterprise customers, using AWS US East list prices and a simple policy.
