Introducing Polaris Catalog

An open source catalog for Apache Iceberg


Forward-Looking Statements
This page contains forward-looking statements, including about our future product offerings, which are not commitments to deliver any product offerings. Actual results and offerings may differ and are subject to known and unknown risks and uncertainties. See our latest 10-Q for more information.

Cross-engine read and write interoperability

Take advantage of open standards to read and write from any REST-compatible engine.

Centralized access across engines

Manage Iceberg tables for all users and engines in one place.

Vendor-agnostic flexibility

Run Polaris Catalog on the infrastructure of your choice: hosted on Snowflake, or self-hosted anywhere containers run.

Maintain one data copy that many engines can query and write to

Iceberg’s open source REST protocol is an open standard that lets you read and write from any REST-compatible engine, including Apache Flink, Apache Spark, PyIceberg, and Trino.
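As a sketch of what attaching to a REST-compatible Iceberg catalog can look like in practice, the hypothetical Spark configuration below registers one as a Spark catalog. The catalog name, endpoint URI, credential, and warehouse values are placeholders, not actual Polaris Catalog settings.

```
# Hypothetical spark-defaults.conf entries for an Iceberg REST catalog.
# The uri, credential, and warehouse values below are placeholders.
spark.sql.catalog.polaris=org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.polaris.type=rest
spark.sql.catalog.polaris.uri=https://catalog.example.com/api/catalog
spark.sql.catalog.polaris.credential=<client-id>:<client-secret>
spark.sql.catalog.polaris.warehouse=analytics
```

Because every engine speaks the same REST protocol to the same endpoint, they all see the same table metadata, which is what allows a single data copy to serve many engines.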

Streamline data management with centralized access

Centralize management in Polaris Catalog, where every user and engine can find and access Iceberg tables from one place.

Run anywhere without lock-in

An open source, vendor-neutral catalog enhances interoperability with Iceberg. Soon in public preview, Polaris Catalog can be hosted on Snowflake’s AI Data Cloud infrastructure and running in minutes, or self-hosted in your own infrastructure. Either way, there is no lock-in: should you want to swap your underlying infrastructure, you retain all Polaris Catalog namespaces, table definitions, and more.
