Data Engineering

Announcing the General Availability of dbt Projects on Snowflake


Data transformations are the core building blocks of any effective data strategy, crucial for constructing robust data pipelines. For years, data teams have relied on dbt (data build tool) to bring software engineering best practices — such as modularity, version control and testing — to SQL and Snowpark transformation workflows.

But the process hasn't always been seamless. Data teams and platform owners often run into some common challenges:

  • Infrastructure overhead: Managing compute for an external orchestrator (such as Airflow) in addition to Snowflake adds maintenance complexity and can reduce reliability across disparate systems.
  • Debugging challenges: Logs and performance data are spread across the orchestrator and Snowflake query history, making it hard to pinpoint root causes and bottlenecks.
  • Governance gaps: Letting new teams build and deploy pipelines is hard when the tooling has a steep learning curve and applying uniform security controls across systems is a challenge.
  • CI/CD setup: Setting up robust, automated continuous integration and continuous delivery (CI/CD) for data transformation code often requires significant custom engineering effort to ensure quality and rapid deployment. 

Now, the power of dbt is available natively on Snowflake. dbt projects on Snowflake enable your data team to build, run and monitor dbt projects directly in Snowflake. With the new Workspaces editor, the next generation of SQL authoring in Snowflake, teams can edit and debug projects in place. The Snowflake CLI offers full parity for managing the deployment and testing of dbt projects from CI/CD tools, such as GitHub Actions. Together, these native options reduce context switching, simplify setup and accelerate the entire data pipeline development lifecycle.

Figure 1: A look at the dbt project CI/CD lifecycle.
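As a sketch of that CI/CD lifecycle, a workflow like the one below could deploy and test a dbt project on every merge to main using the Snowflake CLI. This is an illustrative outline, not a definitive implementation: the secret names, connection setup and object name (MY_DB.MY_SCHEMA.MY_DBT_PROJECT) are placeholders, and the `snow dbt` subcommands should be checked against the Snowflake CLI version you install.

```yaml
# Hypothetical GitHub Actions workflow: deploy a dbt project to
# Snowflake and run its tests whenever main is updated. All names,
# secrets and flags are placeholders to adapt to your account.
name: deploy-dbt-project
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
      SNOWFLAKE_USER: ${{ secrets.SNOWFLAKE_USER }}
      SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
    steps:
      - uses: actions/checkout@v4
      # Install the Snowflake CLI
      - run: pip install snowflake-cli
      # Upload the dbt project files from the repo to Snowflake
      - run: snow dbt deploy MY_DB.MY_SCHEMA.MY_DBT_PROJECT
      # Run dbt tests against the deployed project
      - run: snow dbt execute MY_DB.MY_SCHEMA.MY_DBT_PROJECT test
```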

“At a nonprofit that delivers life-saving care every day, every dollar counts. When we rebuilt our data and analytics platform, we needed right-size tooling that balances capability with simplicity and cost. The moment dbt became part of the Snowflake ecosystem, the path was clear. Today we experiment, codify, test, deploy, schedule and monitor our entire dbt workflow natively inside Snowflake. Consolidating on one platform has created helpful simplicity, improved cost transparency and freed our engineers to focus on delivering value faster.”

Chris Androsoff
Director of Data at STARS

With dbt projects on Snowflake, teams collaborate to build modular and scalable data products to deliver downstream analytics, AI and applications. Customers previewing this feature reported increased confidence in their ability to build (+34%) and troubleshoot (+11%) their transformation pipelines within a single day.1

1 Between April and June 2025, we surveyed 17 first-time users before and after using dbt projects on Snowflake to measure improvements in build and troubleshooting speed.


Learn about dbt projects on Snowflake in a demo from Charlie Hammond.

 

Accelerate development with dbt projects on Snowflake

dbt projects on Snowflake streamline workflows for data engineers, helping them standardize and automate transformation pipelines by allowing for:

  • Development and testing: Create, upload and edit dbt projects in Workspaces, using a file-based IDE, which integrates with Git. Perform test runs for data quality and validate models.
  • Visualization and debugging: Compile and visualize directed acyclic graphs (DAGs) to inspect lineage and dependencies directly in the UI.
  • Deployment and orchestration: Deploy and schedule data pipelines using native Snowflake tasks, simplifying orchestration. Select from various dbt commands such as COMPILE, TEST, RUN and more, right from the native Workspaces IDE.
  • Monitoring and tracing: Monitor execution history with fine-grained logging and tracing. 

Get started today

Whether you want to import an existing dbt project from Git or start from scratch, it’s easy to get started with dbt projects on Snowflake:

  1. Navigate to Snowsight Workspaces.
  2. Choose to create or import a dbt project from a Git repository.
  3. Run your project using an existing Snowflake virtual warehouse.
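Step 3 can also be done in SQL from a worksheet. A hedged sketch, assuming a dbt project already created in a placeholder schema (MY_DB.MY_SCHEMA) and a warehouse you can use:

```sql
-- Run a deployed dbt project on an existing virtual warehouse;
-- all object names here are placeholders.
USE WAREHOUSE MY_WH;
EXECUTE DBT PROJECT MY_DB.MY_SCHEMA.MY_DBT_PROJECT args='run';

-- Or run a single model plus its tests:
EXECUTE DBT PROJECT MY_DB.MY_SCHEMA.MY_DBT_PROJECT
  args='build --select my_model';
```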

Try our getting started tutorial or grab code from Snowflake Labs. The operational efficiency, standardization and simplified developer experience this feature delivers will enable more teams to build and deploy modern data products. Learn more in the Snowflake documentation or visit the Developer Guides page to get hands-on.

 
