Enabling Data Mesh Principles for Organizational Agility

With demonstrable success across a range of industries, organizations are increasingly pursuing cutting-edge data mesh architectures to enhance self-service data use. Given the popularity and continued growth of data mesh, many of today’s leading compute, storage, and data security platforms are also evolving to support and enable it. 

In this blog post, we’ll review the core data mesh principles, highlight how both organizations and modern data platforms are putting those principles into action, and demonstrate just how achievable a secure and efficient data mesh architecture can be. 

The four data mesh principles

Fundamentally, data mesh architectures are built on four core principles: 

  1. Domain-centric ownership and architecture: The teams that work most closely with each domain are responsible for its functionality and upkeep. 
  2. Data-as-a-product: By considering data resources through a product lens, teams can adopt practices centered on quality and ease of use.
  3. Self-service data platforms: Consistent, domain-agnostic access and security measures create a structure that is both clearly defined and repeatable across platforms and domains.
  4. Federated computational governance: Whether manual or automated, global data governance and security measures ensure protection and compliance across domains.

On paper, these principles seem relatively straightforward: Place data in the hands of domain owners, make it accessible to users, and keep it secure with consistent policies. In practice, however, applying them is often a hurdle for organizations. 
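
To ground the idea, here is a minimal sketch of what domain ownership (principle 1) and self-service access (principle 3) might look like in Snowflake SQL. Every database, schema, and role name here is hypothetical, and the snippet illustrates the pattern rather than any prescribed implementation:

```sql
-- Each domain owns its own database and publishes curated data
-- products in a dedicated schema (all names are hypothetical).
CREATE DATABASE sales_domain;
CREATE SCHEMA sales_domain.data_products;

-- Principle 1: the domain team owns and maintains its own assets.
CREATE ROLE sales_domain_owner;
GRANT OWNERSHIP ON DATABASE sales_domain TO ROLE sales_domain_owner;

-- Principle 3: consumers get consistent, read-only, self-service
-- access to whatever the domain publishes, including future tables.
CREATE ROLE data_consumer;
GRANT USAGE ON DATABASE sales_domain TO ROLE data_consumer;
GRANT USAGE ON SCHEMA sales_domain.data_products TO ROLE data_consumer;
GRANT SELECT ON FUTURE TABLES IN SCHEMA sales_domain.data_products
  TO ROLE data_consumer;
```

The repeatable part is the shape: every domain gets the same database, schema, and role structure, so access looks identical no matter which domain a consumer queries.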

Finding platforms that can enable both open access and consistent security can be challenging. On top of this, it can be hard for teams to secure buy-in from their organization when pushing for an emerging concept or strategy. How, then, are modern data teams finding success with the data mesh? 

How Roche is putting data mesh principles into action

Roche Diagnostics, the Swiss multinational healthcare organization, relied on a legacy data architecture plagued by bottlenecks, slow release cycles, sprawling engineering pipelines, and frequent data siloing. This system quickly became unmanageable as Roche sought to make more efficient use of its data resources. 

According to Roche’s Head of Data Management Platforms Paul Rankin, the team realized that “Roche in itself actually is quite advanced when it comes to analytics, and there’s quite a large maturity in the domains themselves, in the business units.” With this realization, the challenge became clear: How could Roche enable these teams to manage their own domains securely to drive business objectives? 

Rankin and his team found a solution to this problem in the budding concept of the data mesh. They knew that if they were able to secure both organizational buy-in and technological support, they could build a new architecture that put the power of decentralized data use in the hands of their high-performing domain teams. Leveraging the distributed nature of the Snowflake Data Cloud, they were able to move from the previous monolithic architecture to a distributed and domain-oriented data mesh format. 

Still, implementing this new architecture was not without its challenges. The team needed to ensure that data access was sufficiently controlled in the new decentralized model. The first attempt at access control was role-based, assigning each domain user one of three specific roles. Claude Zwicker, Accenture's Lead Data Architect, who worked with Rankin's team on the data mesh project, said that after implementing this role-based model, "it didn't take a week or two and we were flooded with requests for customized roles." It was clear that more dynamic, scalable controls were needed to enable secure data access in the data mesh. 

Roche was able to eliminate its role-based access management burden and reduce required access groups by 94% by using Immuta’s dynamic attribute-based access controls and table grants subscription policies. 
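
To make that shift concrete, here is a minimal sketch of the attribute-based pattern in Snowflake SQL: rather than minting a custom role for every access combination, a single row access policy consults a table of user attributes. The table, column, and policy names are hypothetical, and this is a simplified illustration of ABAC, not Roche's or Immuta's actual implementation:

```sql
-- Hypothetical mapping of users to attributes (here, a permitted region).
CREATE TABLE governance.user_attributes (
  user_name STRING,
  region    STRING
);

-- One dynamic policy replaces many static, per-combination roles:
-- each user sees only the rows whose region matches their attributes.
CREATE ROW ACCESS POLICY governance.region_policy
  AS (row_region STRING) RETURNS BOOLEAN ->
  EXISTS (
    SELECT 1
    FROM governance.user_attributes a
    WHERE a.user_name = CURRENT_USER()
      AND a.region = row_region
  );

ALTER TABLE sales_domain.data_products.orders
  ADD ROW ACCESS POLICY governance.region_policy ON (region);
```

Onboarding a new user or region then means updating a row of attributes instead of creating, granting, and maintaining yet another role.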

You can listen to Rankin and Zwicker recount Roche’s full journey from legacy platforms to the data mesh in this presentation.

Snowflake & Immuta for data mesh enablement

As Roche’s experience demonstrates, combining Snowflake and Immuta’s respective capabilities can enable dynamic, decentralized data mesh architectures to make more efficient use of data resources. 

In a joint webinar, Snowflake Field CTO Matthias Nicola and Immuta Co-Founder and CTO Steve Touw discussed the organizational and technological implications of a data mesh implementation. The process begins with an organizational focus: understanding the non-technical implications of this paradigm shift and how it will affect the way teams operate with data. It's also crucial that teams consider the price tag of the data mesh. 

“When you are building something that is distributed in nature … something distributed can easily become more complex and more costly than something centralized,” said Nicola. “So, having a focus on cost and simplicity is really important.” 

Lastly, there is a need for federated governance to support the distributed nature of data mesh architectures. As Touw notes, “You need global standards that are applicable to your business, but you also want to empower your domain owners … to be able to build their own rules.” 

How do Snowflake and Immuta address these requirements?

Organizational alignment

Establishing comprehensive proof of value for both technical and non-technical internal stakeholders can be challenging. With Snowflake's Snowpark API, data teams can work in a common language (Python or SQL) and increase the number of models in production by 10-20%. For business stakeholders, time-to-action can be accelerated 2x-3x to drive more efficient results. Similarly, Immuta users can write universal access and security policies as code or in plain language, giving both technical and business teams a better understanding of the rules governing their Snowflake data resources.

Cost optimization

Transitioning to a data mesh architecture is meant to reduce unnecessary costs, not become an extra financial burden. Snowflake helps users manage and optimize their spend with its fully managed platform and native cost optimization and performance capabilities, so organizations like Roche can strike a balance between domain ownership and governance standards while lowering total cost of ownership. Likewise, teams employing Immuta's dynamic access controls to secure sensitive data across their distributed architectures have seen annual savings in the millions, all while optimizing their data resources and improving analytics.
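
As one concrete illustration of those native cost controls, a Snowflake resource monitor can cap each domain's warehouse spend so that decentralized ownership doesn't turn into unbounded cost. The warehouse name and quota below are hypothetical:

```sql
-- Give each domain its own warehouse budget (hypothetical names and quotas).
CREATE RESOURCE MONITOR sales_domain_monitor
  WITH CREDIT_QUOTA = 100              -- monthly credit budget for the domain
  TRIGGERS ON 80 PERCENT DO NOTIFY     -- warn the domain team early
           ON 100 PERCENT DO SUSPEND;  -- stop new queries at the cap

ALTER WAREHOUSE sales_domain_wh
  SET RESOURCE_MONITOR = sales_domain_monitor;
```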

Federated governance

With a distributed data ecosystem, it is essential that data is governed and secured consistently, regardless of the domain in which it lives or is accessed. Immuta lets users create domain-level policies that merge with global governance policies across all domains, without manual changes or overlapping rules. This consistent governance can be balanced with efficient access, as Immuta can accelerate time-to-data by up to 100x. 
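
A minimal sketch of that division of labor in Snowflake SQL: a central governance team defines one global masking standard, and each domain attaches it to its own columns. All names here are hypothetical, and in practice Immuta automates this kind of enforcement across domains:

```sql
-- Global standard, defined once by the central governance team
-- (hypothetical names; 'PII_READER' is an assumed role).
CREATE MASKING POLICY governance.global_pii_mask AS (val STRING)
  RETURNS STRING ->
  CASE
    WHEN IS_ROLE_IN_SESSION('PII_READER') THEN val  -- authorized users see data
    ELSE '***MASKED***'                             -- everyone else sees a mask
  END;

-- Domain owners apply the shared standard to their own columns.
ALTER TABLE sales_domain.data_products.customers
  MODIFY COLUMN email SET MASKING POLICY governance.global_pii_mask;
```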

Want to learn more about how Snowflake and Immuta can power a secure data mesh architecture that enhances your organizational agility? You can watch the full Powering Your Data Mesh with Snowflake and Immuta webinar, or try this self-guided demo for a hands-on look at our joint capabilities.
