The Data Mesh Mirage: When Decentralization Becomes Chaos
Zhamak Dehghani’s data mesh principles swept through the data engineering world like a revelation. After years of struggling with monolithic data warehouses and bottlenecked central data teams, the idea of treating data as a product and distributing ownership to domain teams felt like liberation.
Three years later, a significant number of organizations that adopted data mesh are quietly abandoning it.
Not because the principles are wrong. They aren’t. But because the gap between the theory and the implementation reality is wider than most organizations can bridge.
What Data Mesh Gets Right
Let’s give credit where it’s due. Data mesh correctly identifies three real problems:
1. Central data teams don’t scale. When every analytics question routes through a central team, that team becomes a bottleneck. Request queues grow. Context gets lost in translation between domain experts and data engineers. The data team builds pipelines for domains they don’t understand.
2. Domain knowledge matters. The marketing team understands marketing data better than a central data engineer. The finance team understands financial data better. Ownership should align with knowledge.
3. Data infrastructure should be a platform. Instead of every team building their own data pipeline infrastructure, a central platform should provide self-service tools that domain teams consume.
All correct. All important.
What Goes Wrong
The Talent Problem
Data mesh assumes that every domain team has — or can hire — engineers capable of building and maintaining production data products. This assumption is false for most organizations.
A typical mid-market company has 3-5 strong data engineers. Data mesh asks you to distribute them across 8-12 domain teams. The math doesn’t work. You end up with domain teams that own data products they can’t maintain, and a platform team that’s responsible for everything but empowered to fix nothing.
The Standards Problem
“Federated computational governance” sounds elegant in a white paper. In practice, it means 12 teams making independent decisions about data quality standards, naming conventions, SLA definitions, and schema evolution policies.
Team A defines “customer” as anyone with an account. Team B defines “customer” as anyone who’s made a purchase. Team C defines “customer” as anyone who’s been active in the last 90 days.
When the CEO asks “how many customers do we have?” — you get three different numbers and a 45-minute meeting to figure out whose definition is correct.
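The three-definitions problem above is easy to reproduce. A minimal sketch (all field names and records here are hypothetical, purely for illustration):

```python
from datetime import date, timedelta

# Hypothetical user records shared by all three teams.
users = [
    {"id": 1, "has_account": True, "purchases": 3, "last_active": date.today() - timedelta(days=10)},
    {"id": 2, "has_account": True, "purchases": 0, "last_active": date.today() - timedelta(days=200)},
    {"id": 3, "has_account": True, "purchases": 1, "last_active": date.today() - timedelta(days=120)},
]

# Team A: a customer is anyone with an account.
customers_a = [u for u in users if u["has_account"]]

# Team B: a customer is anyone who has made a purchase.
customers_b = [u for u in users if u["purchases"] > 0]

# Team C: a customer is anyone active in the last 90 days.
cutoff = date.today() - timedelta(days=90)
customers_c = [u for u in users if u["last_active"] >= cutoff]

# Same data, three "customer counts": 3, 2, and 1.
print(len(customers_a), len(customers_b), len(customers_c))
```

Each query is individually defensible, which is exactly why nobody notices the divergence until the numbers land in the same board deck.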
The Incentive Problem
Domain teams exist to serve their domain’s business objectives. The sales team’s job is to close deals. The marketing team’s job is to generate leads. Neither team has an incentive to maintain a high-quality data product for another team’s consumption.
“But it’s a cultural shift!” advocates say. Sure. But cultural shifts don’t happen by declaration. They happen when incentives align. And in most organizations, there’s no reward for maintaining a data product that primarily benefits another department.
The Decision Framework
Data mesh is not always wrong. But it has prerequisites that most organizations don’t meet. Use this framework:
When Data Mesh Works
- Large organizations (500+ engineers) with mature engineering practices
- Strong existing data culture where domain teams already have data engineering capability
- Executive sponsorship with real accountability for data product quality
- Sufficient talent to staff domain-level data product teams (minimum 2 data engineers per domain)
- Mature platform team capable of providing self-service data infrastructure
When Data Mesh Fails
- Mid-market organizations (50-200 engineers) without deep data engineering talent
- Low data maturity where basic data governance hasn’t been established
- Siloed culture where cross-functional collaboration is aspirational, not actual
- Budget constraints that prevent hiring dedicated data engineers for each domain
The Middle Path
For most organizations, the right answer is neither a fully centralized data warehouse nor a fully decentralized data mesh. It’s a hybrid:
- Central platform team provides data infrastructure, quality frameworks, and standards
- Domain teams own data models but use centralized tooling
- Central governance defines enterprise-wide definitions (what is a “customer”?)
- Domain-level stewards (not full data engineers) maintain domain data quality using the platform’s tools
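In code, the "central governance defines, domains consume" split can be as simple as one shared definition that domain models build on instead of re-deriving. A sketch, assuming a hypothetical canonical rule (has an account and at least one purchase) and hypothetical names throughout:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class UserRecord:
    user_id: int
    has_account: bool
    lifetime_purchases: int
    last_active: date

def is_customer(u: UserRecord) -> bool:
    """Canonical enterprise-wide 'customer' definition, owned by central
    governance: has an account AND has made at least one purchase."""
    return u.has_account and u.lifetime_purchases > 0

def marketing_active_customers(users: list[UserRecord], as_of: date) -> list[UserRecord]:
    """A domain-owned model: marketing layers its own 90-day activity filter
    on top of the shared definition rather than inventing its own."""
    cutoff = as_of - timedelta(days=90)
    return [u for u in users if is_customer(u) and u.last_active >= cutoff]
```

The point is the dependency direction: the domain team owns its filter, but the word "customer" resolves to exactly one function, so the CEO's question has exactly one answer.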
This isn’t as intellectually pure as data mesh. It doesn’t make for a good conference talk. But it works for organizations that exist in the real world.
The Honest Assessment
Before adopting data mesh, answer honestly:
- Can every domain team hire and retain a data engineer? If not, who will maintain the data products?
- Do you have executive sponsorship for cross-domain data standards? If not, who will resolve definition conflicts?
- Is your data platform mature enough to provide self-service tooling? If not, you’re asking domain teams to build infrastructure AND data products.
If any answer is “no,” data mesh will create more problems than it solves. And that’s okay. Start with a strong central data team, mature your practices, build the platform, and revisit the decentralization question in 18-24 months.
The Garnet Grid perspective: Data architecture decisions should be driven by organizational capability, not conference hype. We help organizations choose the architecture that fits their reality — not their aspirations. Explore our data engineering consulting →