Core Platform

Accelerate Your Migration: Turn SSIS Packages into Native Snowflake + dbt Projects with SnowConvert AI

Enterprises are increasingly modernizing their legacy on-premises ETL pipelines, such as those built in SQL Server Integration Services (SSIS), to gain the power and flexibility of the cloud. This modernization is an opportunity to move to a platform built for today's data workloads: modern cloud data platforms such as Snowflake eliminate information silos by unifying data in a single, governed location, and they provide the scalable compute needed to handle any volume of data.

In an ELT architecture, raw data is loaded directly into a powerful cloud data warehouse like Snowflake. Transformations are then performed in place, leveraging the platform's immense, scalable compute resources to deliver insights faster and more efficiently.

That move is more than a lift-and-shift: it requires a fundamental paradigm shift from traditional ETL (extract, transform, load) to the modern ELT (extract, load, transform) model. To accelerate it, we’re introducing a new SSIS replatform path in SnowConvert AI that turns SSIS packages into native Snowflake + dbt projects. The converted output keeps control flow and data transformations inline — so you can debug in one place, avoid naming collisions and ship faster with dbt best practices from day one.

Why we chose our approach

Migrating SSIS isn’t just a matter of “translating this component to that component.” It’s untangling control flows, variables and data movement that have lived together for years. Our approach preserves that context inline:

  • One place to debug: Containers and branches are converted inline within parent Snowflake procedures or tasks. No bouncing across tools to understand why something ran (or didn’t run).

  • Deterministic orchestration: Single-use packages become Snowflake Tasks with explicit dependencies. Reusable packages (invoked by other packages) become procedures for clean, synchronous reuse.

  • Fewer naming collisions: We consistently sanitize object names across dbt models, tasks, procedures and variables, so deployments remain predictable in shared environments.

  • Familiar, but modern: Data movement and business logic land in dbt with layered models and macros, while orchestration runs natively on Snowflake. Same mental model as SSIS — without the engine lock-in.

Why this makes migrations safer (and faster)

Two things do the heavy lifting here: inline control flow and a dbt foundation from day one.

  • Inline = insight: When control flow is inline, execution order, variable scope and data paths are visible at a glance. That shortens the “find it, fix it, rerun it” loop.

  • dbt from day one: You start on a modern analytics engineering stack — tests, docs, lineage, CI/CD — without rewriting everything twice.

  • Fewer footguns: Consistent naming and structure reduce breakage during deployments and make cross-project references boring (the good kind).

  • Reuse without the pain: Turning reusable packages into procedures avoids async quirks and keeps parameterized runs predictable.
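To illustrate that last point, a converted reusable package can be invoked synchronously with different parameters, for example per tenant. The procedure name and parameter below are hypothetical stand-ins, not actual SnowConvert output:

```sql
-- Illustrative only: LOAD_TENANT_ORDERS stands in for a converted reusable package.
CALL LOAD_TENANT_ORDERS('tenant_a');  -- synchronous; returns when the load finishes
CALL LOAD_TENANT_ORDERS('tenant_b');  -- parameterized reuse, no async task coordination needed
```

Because each CALL blocks until completion, loops over tenants or partitions behave exactly like nested Execute Package tasks did in SSIS.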

How it works

1) Data flows to dbt Projects on Snowflake

Each SSIS data flow becomes its own dbt project with three opinionated layers:

  • Staging (source-friendly schemas, light transforms)

  • Intermediate (joins, splits, lookups, standardization)

  • Marts (destination tables and incremental models)

Materializations are sensible by default (view/ephemeral/incremental/table), and you can override with config() at the model level. The result is readable model names, clear lineage and CI-ready SQL.
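A model-level override uses standard dbt configuration. The sketch below shows the common incremental pattern for a marts-layer model; the model, column and ref names are illustrative, not generated output:

```sql
-- models/marts/fct_orders.sql (hypothetical model name)
{{ config(
    materialized='incremental',
    unique_key='order_id'
) }}

select
    order_id,
    customer_id,
    order_total,
    loaded_at
from {{ ref('int_orders_enriched') }}

{% if is_incremental() %}
  -- on incremental runs, only process rows newer than what the target already holds
  where loaded_at > (select max(loaded_at) from {{ this }})
{% endif %}
```

Dropping the `config()` block falls back to the default materialization chosen for that layer.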

2) Control flow to native Snowflake orchestration

We generate an orchestration SQL file per package:

  • Standard packages → tasks with explicit AFTER relationships and an init task to seed variables.

  • Reusable packages → procedures that can be CALLed with parameters (great for loops or multitenant execution)

Data flow tasks call into the corresponding dbt project, so dbt remains the source of truth for transformation logic.
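The generated orchestration file takes roughly the following shape. All object names here are hypothetical, and the `SET_PKG_VARIABLE` procedure is an illustrative stand-in for the generated variable seeding logic; the data flow task assumes the EXECUTE DBT PROJECT command from dbt Projects on Snowflake:

```sql
-- Illustrative shape of a generated orchestration file (names are hypothetical)
CREATE OR REPLACE TASK PKG_SALES_INIT
  WAREHOUSE = MIGRATION_WH
  SCHEDULE = 'USING CRON 0 2 * * * UTC'
AS
  CALL SET_PKG_VARIABLE('PKG_SALES', 'LoadDate', CURRENT_DATE()::VARCHAR);

CREATE OR REPLACE TASK PKG_SALES_DATA_FLOW
  WAREHOUSE = MIGRATION_WH
  AFTER PKG_SALES_INIT          -- explicit dependency replaces SSIS precedence constraints
AS
  EXECUTE DBT PROJECT SALES_ORDERS_DBT args='run';

-- Tasks are created suspended; resume the root to activate the DAG
ALTER TASK PKG_SALES_DATA_FLOW RESUME;
ALTER TASK PKG_SALES_INIT RESUME;
```

The AFTER relationships make the execution graph explicit and inspectable in Snowflake, rather than implicit in a designer canvas.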

3) Variables that behave across layers

SSIS variables are centralized in Snowflake (read via UDF, write via procedure), and exposed to dbt through lightweight macros. That gives you a single pattern for variable access — whether you’re inside a task, a procedure or a dbt model.
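A minimal sketch of that pattern, assuming a hypothetical `GET_SSIS_VARIABLE` UDF, `SET_SSIS_VARIABLE` procedure and matching dbt macro (none of these names are actual SnowConvert output):

```sql
-- Read a package variable inside a task or procedure (UDF, hypothetical name):
SELECT GET_SSIS_VARIABLE('PKG_SALES', 'LoadDate');

-- Write it back through a procedure (hypothetical name):
CALL SET_SSIS_VARIABLE('PKG_SALES', 'RowCount', '1042');

-- The same variable inside a dbt model, via a lightweight macro:
-- select * from {{ ref('stg_orders') }}
-- where order_date = {{ get_ssis_variable('PKG_SALES', 'LoadDate') }}
```

One access pattern across tasks, procedures and models means a variable changed mid-run is observed consistently by every downstream step.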

What is available today

  • Runnable output on day one: SnowConvert emits a ready-to-run structure that cleanly separates transforms (dbt) from orchestration (tasks/procs) under a consistent project layout.

  • Broad component coverage: Common SSIS sources, destinations and transforms (Derived Column, Lookup, Merge/Join, Conditional Split, etc.) map directly to dbt models and SQL.

  • Clear, actionable reports: We flag items that need attention and call out expected behavioral differences so you can prioritize fixes and align stakeholders early.

How to get started (in preview)

  1. Open SnowConvert AI and choose Replatform for SSIS.

  2. Review the generated dbt projects and orchestration code, and address any flagged items in the report.

  3. Deploy and run your dbt projects in Snowflake; orchestrate with the generated tasks and procedures.
