Distributed domains and data products: harnessing Snowflake and dbt Cloud with a customer success story from Coalesce 2023

Jai Parmar of Snowflake explains how Snowflake and dbt reduce complexity when scaling data.

"Data mesh is not a silver bullet…It doesn't work that way."

Jai Parmar, AI & ML Senior Solutions Architect at Snowflake, explains how Snowflake and dbt help address complexity when scaling data, introducing new features such as dbt Mesh and a revamped semantic layer. He also delves into the concept of data mesh and the impact it’s having on organizations today.
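For readers new to dbt Mesh, the core idea is that one dbt project can publish models as data products and other projects can build on them directly. Below is a minimal, hypothetical sketch of a downstream domain model using a cross-project ref; the project and model names are illustrative rather than taken from the talk, and the upstream model would also need to be marked public and listed in the consuming project's dependencies.yml.

```sql
-- models/marts/customer_health.sql (in a downstream domain project)
-- The two-argument ref resolves against a public model owned by an
-- upstream project, so the domain team builds on the shared data
-- product without copying it into their own project.
with customers as (
    select * from {{ ref('platform_core', 'dim_customers') }}  -- upstream public model (hypothetical)
),

tickets as (
    select * from {{ ref('stg_support_tickets') }}  -- local staging model (hypothetical)
)

select
    customers.customer_id,
    count(tickets.ticket_id) as ticket_count
from customers
left join tickets
    on tickets.customer_id = customers.customer_id
group by 1
```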

Data mesh is a revolutionary approach in the data community that decentralizes data operations

Jai explains the benefits of implementing data mesh in different organizations. "Data mesh has been one of those fundamental things in the data community, whereby it's actually starting to proliferate between all the different areas of the organization. It goes through technology, goes through people, and process," says Jai. He also emphasizes the importance of organizational change, data ownership, and the definition of incentives and KPIs for data teams in successful data mesh implementation.

Jai also shares the learnings from their data mesh engagements: "Data mesh is not a silver bullet. It's not something that you actually think, ‘Okay, you can just build it, and it's going to work’…Setting up a hub and spoke model, when working with building domain teams and building technology, is actually really important."

Snowflake can be leveraged to build data capabilities

Jai draws attention to the benefits of the Snowflake platform, emphasizing its distributed nature, data-sharing capabilities, and its compatibility with third-party tooling. "Snowflake is a distributed platform... you don't have to worry about the bells and whistles. Literally, just get to work, and it continuously works for you," says Jai.
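To make the data-sharing point concrete, the snippet below is a minimal sketch of Snowflake Secure Data Sharing, which grants another account read access to objects without copying any data. The share, database, schema, table, and account names are placeholders, not details from the talk.

```sql
-- Provider account: publish a data product as a share (all names are placeholders)
create share customer_domain_share;
grant usage on database analytics to share customer_domain_share;
grant usage on schema analytics.marts to share customer_domain_share;
grant select on table analytics.marts.dim_customers to share customer_domain_share;
alter share customer_domain_share add accounts = myorg.finance_account;

-- Consumer account: mount the share as a read-only database; no data is moved
create database customer_domain from share myorg.provider_account.customer_domain_share;
```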

He further highlights the importance of using partner tools and services in conjunction with Snowflake to deliver business value. “DataOps is one of the fundamental things to drive time-to-value, so ensure that you leverage that," he advises.
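One lightweight way to codify that DataOps discipline in dbt is to version data quality checks alongside the transformations so every job run validates them. The sketch below is a hypothetical singular test, a SQL file placed in a project's tests/ directory; the model and column names are invented for illustration.

```sql
-- tests/assert_no_negative_order_totals.sql
-- A dbt singular test: the query selects the rows that violate the rule,
-- so any returned rows cause `dbt test` (or `dbt build`) to fail the run.
select
    order_id,
    order_total
from {{ ref('fct_orders') }}  -- hypothetical mart model
where order_total < 0
```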

Practical use cases of data mesh on Snowflake demonstrate its effectiveness in driving business value and improving operational efficiency

Jai shares examples of organizations like Flexport, which initially migrated their database to Snowflake and later adopted a decentralized approach by implementing data mesh. "Once they built their source-oriented data products, they made it available to the wider ecosystem within the organization," he explains.

Another example is a worldwide agency that built everything on dbt, including both source-oriented and consumer-oriented data products, which were then shared with different accounts in Snowflake. For more examples, Jai adds, "Please take a look at the website for our details about what goes on and how we've done data mesh for other customers…"

Jai’s key insights

  • The success of data mesh lies in its ability to remove the boundaries created by a centralized approach, reducing bottlenecks and decreasing time-to-value
  • Building data products requires focus on various areas, including organizational change, data ownership, incentives and KPIs for data teams, and data governance
  • Snowflake's platform allows for the easy sharing of data between different databases and accounts, without moving the data
  • Building a DataOps capability with dbt is fundamental in driving time-to-value
  • Data mesh is not a silver bullet and requires significant investment in people and processes