What Is Driving the Struggle for Data Quality in Modern Enterprises?

Enterprises today are increasingly grappling with data quality challenges, driven by a combination of outdated data architectures and a general lack of innovative data culture. This issue is brought to light in the 2024 State of Analytics Engineering survey conducted by dbt Labs, which polled 456 data practitioners and leaders to offer a comprehensive overview of the current state of data management and quality in organizations.

Prevalence of Data Quality Concerns

A significant 57% of survey respondents identified poor data quality as their primary concern, a notable increase from 44% in 2022. This indicates a growing acknowledgment of data quality issues within enterprises and highlights just how pervasive the problem has become. The survey also found that low stakeholder data literacy (cited by nearly half of respondents) and ambiguous data ownership (cited by 44%) were major concerns. These findings emphasize that data quality is intertwined with broader challenges in data governance and stakeholder engagement.

Time and Resource Allocation in Data Management

Data practitioners reportedly spend 55% of their time maintaining or organizing data sets, and 40% cited integrating data from various sources as their biggest challenge, underlining the labor-intensive nature of current data management practices. These statistics reflect the substantial time and resources devoted to managing data rather than deriving insights from it. They also highlight a critical need for streamlined processes and better tools to manage and integrate data effectively, so that data teams can focus on more strategic work.

Investment Trends

Despite these challenges, nearly 40% of companies plan to maintain their investment in data quality, platforms, and catalogs, while 10% to 37%, depending on the category, intend to increase it. This trend underscores a recognition of the importance of data quality and the need for robust data management solutions. Companies are beginning to realize that without addressing the root issues, their data initiatives, including AI and machine learning applications, may not reach their full potential.

Three Root Causes of Data Quality Issues

Lack of Innovative Data Culture

Historically, enterprises have neglected data collection, management, and reuse, prioritizing application development over data quality. The result is data that is fragmented or trapped within underutilized applications, hindering effective use. An innovative data culture, one that emphasizes high-quality data and continuous improvement, is essential for overcoming these issues.

Legacy Data Architectures

Many organizations continue to rely on outdated, costly legacy data architectures that do not scale efficiently. This creates what Cinchy terms an “integration tax,” consuming over 50% of IT budgets. The persistence of these old systems adds complexity and cost to data integration efforts, thereby impairing data quality.

Failure to Address Integration Complexity

Complicated architectures drive up integration costs, further exacerbating data quality issues. One effective remedy is a standards-based semantic graph architecture, which maps disparate sources onto a shared vocabulary rather than maintaining point-to-point pipelines. Simplifying integration in this way mitigates cost and improves overall data quality, as the sketch below illustrates.
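To make the idea concrete, here is a minimal sketch of semantic-graph integration using Python's rdflib library. The namespace, record fields, and query are illustrative assumptions, not taken from the survey or from any specific vendor's product.

```python
# Minimal sketch of standards-based semantic integration (illustrative only).
# The namespace and field names below are assumptions for demonstration.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/enterprise/")
g = Graph()

# Records from two separately built applications are mapped onto one
# shared vocabulary instead of yet another point-to-point pipeline.
crm_record = {"id": "cust-42", "email": "a@example.com"}
billing_record = {"customer_id": "cust-42", "balance": 120.50}

# Both systems share the key "cust-42", so their facts attach to one node.
assert billing_record["customer_id"] == crm_record["id"]
subject = EX[crm_record["id"]]
g.add((subject, RDF.type, EX.Customer))
g.add((subject, EX.email, Literal(crm_record["email"])))
g.add((subject, EX.balance, Literal(billing_record["balance"])))

# A single standards-based SPARQL query now spans both sources.
results = g.query(
    """
    SELECT ?email ?balance WHERE {
        ?c a ex:Customer ; ex:email ?email ; ex:balance ?balance .
    }
    """,
    initNs={"ex": EX},
)
for email, balance in results:
    print(email, balance)
```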

Upstream Data Management Strategies

Pushing proactive data management efforts upstream, closer to the data source, can significantly improve data quality. This approach provides better control and understanding of data, reducing the challenges of repurposing data collected further downstream. By focusing on quality at the point of origin, as the sketch below illustrates, enterprises can ensure that data remains reliable and useful as it moves through the organization.
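As one way to picture quality checks at the point of origin, here is a minimal sketch of validating records at ingestion, before they enter downstream pipelines. The schema and rules are illustrative assumptions, not a prescribed standard.

```python
# Minimal sketch of upstream validation: records are checked at the point
# of origin, before entering downstream pipelines. The schema and rules
# here are illustrative assumptions only.
from datetime import datetime, timezone

REQUIRED_FIELDS = {"order_id": str, "amount": float, "created_at": str}

def validate_at_source(record: dict) -> list[str]:
    """Return a list of quality issues; an empty list means the record passes."""
    issues = []
    for field_name, expected_type in REQUIRED_FIELDS.items():
        if field_name not in record:
            issues.append(f"missing field: {field_name}")
        elif not isinstance(record[field_name], expected_type):
            issues.append(
                f"wrong type for {field_name}: {type(record[field_name]).__name__}"
            )
    # Domain rules are cheapest to enforce here, where context still exists.
    if isinstance(record.get("amount"), float) and record["amount"] < 0:
        issues.append("amount must be non-negative")
    return issues

record = {
    "order_id": "ord-7",
    "amount": 19.99,
    "created_at": datetime.now(timezone.utc).isoformat(),
}
problems = validate_at_source(record)
print("accepted" if not problems else f"rejected: {problems}")
```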

Industry Example: Direct Lithium Extraction

An analogy to effective upstream data management can be drawn from Direct Lithium Extraction (DLE) in the oil industry. Just as DLE extracts valuable lithium from brine at the oil field site, proactive data management captures value and ensures quality by managing data where it is collected. The analogy underscores the importance of addressing data quality at the earliest stages of collection.

Approach to Data Management

To overcome data quality challenges, organizations must adopt a producer's mindset toward data: owning and managing the data lifecycle to create reliable, reusable data products. Treating data as a product ensures that it is consistently managed, monitored, and improved, leading to better quality and more reliable insights. A minimal illustration of this contract-style ownership follows.
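Here is a minimal sketch of what "data as a product" can look like in practice: an explicit contract that names an accountable producer and the quality guarantees consumers can rely on. All names, fields, and thresholds are illustrative assumptions.

```python
# Minimal sketch of a "data product" contract owned by its producing team.
# All names, fields, and thresholds are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class DataProductContract:
    name: str
    owner_team: str           # the accountable producer, not a downstream consumer
    schema_version: str
    freshness_sla_hours: int  # how stale the data may become before breaching SLA
    quality_checks: list[str] = field(default_factory=list)

orders_contract = DataProductContract(
    name="orders_daily",
    owner_team="commerce-data",
    schema_version="2.1",
    freshness_sla_hours=24,
    quality_checks=["no_null_order_id", "amount_non_negative"],
)
print(f"{orders_contract.name} is owned by {orders_contract.owner_team}")
```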

Overarching Trends and Consensus Viewpoints

The picture that emerges across these findings is consistent: the enterprise struggle with data quality stems primarily from outdated data architectures and a pervasive lack of innovative data culture, and the dbt Labs survey results discussed above show just how widespread the problem has become.

The findings underscore how critical it is to transition from old data frameworks to more advanced, adaptable solutions that ensure accurate, reliable data. Organizations must recognize that fostering an innovative data culture is essential not only for addressing data quality issues but also for driving business growth and efficiency. By embracing new technologies and cultivating a forward-thinking data culture, businesses can better harness their data, leading to more informed decision-making and improved outcomes.
