Version Control for Snowflake Semantic Views with Terraform

As Semantic Views become a foundational layer for governed analytics and AI in Snowflake, teams are increasingly asking:

How do we version control and automate changes to our semantic layer?

If your organization already uses infrastructure as code (IaC), the answer is straightforward: manage Semantic Views with Terraform, HashiCorp's IaC tool, and automate deployment through CI/CD. This post walks through how to treat Semantic Views like code, enabling repeatable, auditable and production-ready deployments.

Why version control matters for Semantic Views

Semantic Views define:

  • Business logic

  • Dimensions and metrics

  • Metric calculations used in BI and AI

  • Governance rules

When those definitions change, dashboards, APIs and AI agents are all affected. While Snowsight provides a powerful and intuitive interface for building and managing Semantic Views, some teams, especially those operating at scale, may benefit from additional software engineering controls around change management. Using Terraform enables:

  • A clear audit trail of changes

  • Peer-reviewed pull requests

  • Safer production deployments

  • Easier rollback through version history

Terraform brings software engineering discipline to your semantic layer.

The workflow: Tables and Semantic Views as code

Terraform Cloud is a managed service that handles your Terraform state, runs and variables. It is designed to securely manage your infrastructure as code.

  • Create a new workspace: Log in to your Terraform Cloud account. Create a new organization if you haven't already. Then, create a new workspace with the "API-driven workflow" option. Give it a descriptive name, like snowflake_infrastructure.

  • Add your Snowflake credentials: In your new workspace, go to the Variables tab. Add the following five Terraform variables. Be sure to mark them as Sensitive so their values are not exposed in the logs.

    • snowflake_organization_name

    • snowflake_account_name

    • snowflake_user

    • snowflake_password

    • snowflake_role

  • Create an API token: You need a token to allow GitHub Actions to communicate with your Terraform Cloud workspace. Go to User Settings -> Tokens and generate a new token. Copy this token immediately as you won't be able to see it again.

Configure your Terraform files

Next, you will create two files in your local repository. These files define the infrastructure you want to create and configure the Terraform Cloud backend.

Using Terraform, you can define a:

  • Warehouse

  • Database and schema

  • Base table

  • Semantic View on top of the table

The table and Semantic View resources are currently in preview, so you must enable them explicitly through the provider's preview_features_enabled attribute, as shown in the file below.

For example, in this project, we provision:

  • TF_WAREHOUSE

  • TF_DATABASE

  • TF_SCHEMA

  • TF_TABLE

  • TF_SEMANTIC_VIEW with:

    • Dimension: PRODUCT_CATEGORY

    • Metric: AVERAGE_PRICE

Everything is declared in Terraform files and stored in Git.

main.tf

This file declares the Terraform Cloud backend and the Snowflake resources you want to manage.

# This block specifies the required providers for our Terraform configuration.
# We're using the 'snowflake' provider from the snowflakedb namespace.
terraform {
  required_providers {
    snowflake = {
      source = "snowflakedb/snowflake"
    }
  }

  # This is the Terraform Cloud backend configuration.
  # It tells Terraform to store the state file in your Terraform Cloud workspace.
  # Replace the organization and workspace names below with your own values.
  backend "remote" {
    hostname     = "app.terraform.io"
    organization = "ka_tf_org"
    workspaces {
      name = "snowflake_infrastructure"
    }
  }
}

# The provider block configures the Snowflake provider with your connection details.
# These variables are read directly from your Terraform Cloud workspace.
provider "snowflake" {
  organization_name = var.snowflake_organization_name 
  account_name      = var.snowflake_account_name
  user              = var.snowflake_user
  password          = var.snowflake_password
  role              = var.snowflake_role
  preview_features_enabled = [ "snowflake_table_resource", "snowflake_semantic_view_resource" ]
}

# Create a Snowflake warehouse
resource "snowflake_warehouse" "tf_warehouse" {
  name             = "TF_WAREHOUSE"
  warehouse_size   = "XSMALL"
  auto_suspend     = 600
  auto_resume      = true
  comment          = "Warehouse created with Terraform for a simple example."
}

# Create a Snowflake database
resource "snowflake_database" "tf_database" {
  name    = "TF_DATABASE"
  comment = "Database created with Terraform for a simple example."
}

# Create a Snowflake schema within the new database
resource "snowflake_schema" "tf_schema" {
  name     = "TF_SCHEMA"
  database = snowflake_database.tf_database.name
  comment  = "Schema created with Terraform."
}

# Create a simple table within the new schema
resource "snowflake_table" "tf_table" {
  database = snowflake_database.tf_database.name
  schema   = snowflake_schema.tf_schema.name
  name     = "TF_TABLE"
  comment  = "A simple table created with Terraform."

  # Define the columns for the table.
  column {
    name = "ID"
    type = "NUMBER(38,0)"
  }

  column {
    name = "PRODUCT_NAME"
    type = "VARCHAR(255)"
  }

  column {
    name = "PRICE"
    type = "NUMBER(10,2)"
  }

  column {
    name = "PRODUCT_CATEGORY"
    type = "VARCHAR(255)"
  }
}

# Create a Snowflake semantic view
resource "snowflake_semantic_view" "tf_semantic_view" {
  name     = "TF_SEMANTIC_VIEW"
  database = snowflake_database.tf_database.name
  schema   = snowflake_schema.tf_schema.name
  comment  = "A simple semantic view created with Terraform."

  tables {
    table_alias = "TF_TABLE"
    table_name  = "\"${snowflake_database.tf_database.name}\".\"${snowflake_schema.tf_schema.name}\".\"${snowflake_table.tf_table.name}\""
  }

  dimensions {
    qualified_expression_name = "\"TF_TABLE\".\"PRODUCT_CATEGORY\""
    sql_expression            = "\"TF_TABLE\".\"PRODUCT_CATEGORY\""
  }

  metrics {
    semantic_expression {
      qualified_expression_name = "\"TF_TABLE\".\"AVERAGE_PRICE\""
      sql_expression            = "AVG(\"TF_TABLE\".\"PRICE\")"
      synonym                   = ["Average Price"]
    }
  }
}

variables.tf

This file defines the variables used in main.tf.

variable "snowflake_account_name" {
  type = string
}

variable "snowflake_user" {
  type = string
}

variable "snowflake_password" {
  type      = string
  sensitive = true
}

variable "snowflake_role" {
  type    = string
  default = "ACCOUNTADMIN"
}

variable "snowflake_organization_name" {
  type = string
}
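
With both files in place, you can sanity-check the configuration locally before wiring up CI. A quick sketch (assumes the Terraform CLI is installed and your Terraform Cloud token is configured, e.g. via terraform login):

```shell
# Format check and static validation -- catches syntax and schema errors
# before the CI pipeline ever runs.
terraform fmt -check      # fails if any file is not canonically formatted
terraform init            # connects to the Terraform Cloud backend
terraform validate        # checks resource arguments against the provider schema
```

Running these locally keeps trivial mistakes out of your pull requests.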

Automating deployment with GitHub Actions

Create the GitHub Actions workflow file

Now, create the pipeline that will run your Terraform code. In your GitHub repository, create a new directory and file: .github/workflows/terraform-snowflake.yml. This file tells GitHub to run Terraform when you push to the main branch.

Before you save this file, go to your GitHub repository's Settings -> Secrets and variables -> Actions and add a new secret named TF_API_TOKEN. Paste the Terraform Cloud API token you created in the Terraform Cloud setup.
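If you prefer a terminal, the same secret can be added with the GitHub CLI (a sketch; assumes gh is installed and authenticated against your repository):

```shell
# Store the Terraform Cloud token as a repository secret named TF_API_TOKEN.
# gh prompts you to paste the token value.
gh secret set TF_API_TOKEN
```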

# This is the name of your workflow, which will appear in the GitHub Actions tab.
name: "Terraform Snowflake CI/CD"

# This specifies when the workflow should run. It will run on any push to the 'main' branch.
on:
  push:
    branches:
      - main
      
# The jobs section defines the tasks that the workflow will execute.
jobs:
  terraform_apply:
    # This specifies the type of runner to use.
    runs-on: ubuntu-latest

    # This block defines an environment variable for the job, using the GitHub secret.
    env:
      TF_API_TOKEN: ${{ secrets.TF_API_TOKEN }}
      
    # This section lists the steps the job will take.
    steps:
      # Step 1: Checks out your repository's code.
      - name: Checkout repository
        uses: actions/checkout@v4

      # Step 2: Sets up Terraform on the runner and passes the API token for authentication.
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v3
        with:
          cli_config_credentials_token: ${{ secrets.TF_API_TOKEN }}

      # Step 3: Initializes the Terraform project. This connects to your Terraform Cloud backend.
      - name: Terraform Init
        run: terraform init

      # Step 4: Creates a Terraform plan to show what changes will be applied.
      # This is a best practice to review changes before applying.
      - name: Terraform Plan
        run: terraform plan

      # Step 5: Applies the Terraform configuration. The plan and apply commands are
      # executed within Terraform Cloud's environment. The -auto-approve flag
      # is used to automatically confirm the changes, which is suitable for a CD pipeline.
      - name: Terraform Apply
        run: terraform apply -auto-approve

Once Terraform is configured with Terraform Cloud and Snowflake credentials:

  • Every pull request becomes a reviewable semantic change

  • Every merge to main automatically:

    • Runs Terraform plan

    • Applies the change to Snowflake

  • The state is centrally managed in Terraform Cloud

This means:

  • Semantic View updates are traceable

  • Changes require code review

  • Rollbacks are straightforward

  • Environments (dev/prod) stay consistent
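
To make pull requests reviewable before anything reaches Snowflake, you can add a companion workflow that runs only terraform plan on pull requests. A sketch (the file name is illustrative; it reuses the same TF_API_TOKEN secret):

```yaml
# .github/workflows/terraform-plan.yml (illustrative name)
name: "Terraform Plan on PR"

on:
  pull_request:
    branches:
      - main

jobs:
  terraform_plan:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v3
        with:
          cli_config_credentials_token: ${{ secrets.TF_API_TOKEN }}

      - name: Terraform Init
        run: terraform init

      # Plan only -- no apply. Reviewers can inspect the proposed changes
      # in the Actions log (or the Terraform Cloud run) before merging.
      - name: Terraform Plan
        run: terraform plan
```

Merging to main then triggers the apply workflow shown above, so every change is planned and reviewed before it is applied.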

Commit and push to GitHub

With all your files in place and your secrets configured, you are ready to trigger the pipeline.

git add .
git commit -m "Configure Terraform Cloud and GitHub Actions"
git push origin main

After the push, navigate to the Actions tab in your GitHub repository. You will see a new workflow run in progress. You can also log in to Terraform Cloud and see the run happening live under the Runs section of your workspace. Once complete, your Snowflake resources will be provisioned, and any future changes to your main branch will automatically be applied.

What this unlocks for customers

  • Auditable semantic changes: Every metric adjustment, dimension rename, or logic change is versioned and reviewable.

  • CI/CD for analytics: Semantic updates can follow the same pipeline as application code.

  • Safe iteration: You can evolve your semantic model via pull requests instead of manual UI edits.

  • Cross-team collaboration: Data engineers, PMs, TPMs and AI teams can align on semantic logic through code.

  • AI-ready governance: Since AI agents rely on semantic definitions, version control supports consistency and reliability.

What about YAML and semantic studio?

Snowflake’s YAML-based Semantic View authoring experience offers suggestions and assisted editing. When defining Semantic Views directly in Terraform, you don’t get those interactive UI benefits. However, Terraform-based workflows remain highly valuable for:

  • Production promotion

  • Governance-heavy environments

  • Large enterprises requiring CI/CD rigor

  • Customers with strict change-control policies

A practical customer use case

Imagine a retail company:

  • The data team defines metrics in Terraform.

  • PR updates introduce a new pricing metric.

  • GitHub Actions deploy the updated Semantic View automatically.

  • BI dashboards and Cortex agents can immediately begin using the updated semantic logic.

  • If something breaks, rollback is one commit away.

No manual UI drift. No undocumented metric logic. No surprises.


Forward-looking statement

This content contains forward-looking statements, including about our future product offerings, which are not commitments to deliver any product offerings. Actual results and offerings may differ and are subject to known and unknown risks and uncertainties. See our latest 10-Q for more information.
