Le Lézard
Subjects: Photo/Multimedia, Survey

Data Downtime Nearly Doubled Year Over Year, Monte Carlo Survey Says


Monte Carlo, the data observability company, today announced the results of its second annual State of Data Quality survey. The report reveals that data downtime, the periods when an organization's data is missing, wrong, or otherwise inaccurate, nearly doubled year over year (1.89x).

The Wakefield Research survey, which was commissioned by Monte Carlo and polled 200 data professionals in March 2023, found that three critical factors contributed to this increase in data downtime.

More than half of respondents reported that 25% or more of revenue was affected by data quality issues. The average percentage of impacted revenue jumped to 31%, up from 26% in 2022. Additionally, an astounding 74% reported that business stakeholders identify issues first "all or most of the time," up from 47% in 2022.

These findings suggest data quality remains among the biggest problems facing data teams, with bad data having more severe repercussions on an organization's revenue and data trust than in years prior.

Data Quality Tradeoffs

The survey also suggests data teams are making a tradeoff between data downtime and the amount of time spent on data quality as their datasets grow.

For instance, organizations with fewer tables reported spending less time on data quality than their peers with more tables, but their average times to detection and resolution were comparatively higher. Conversely, organizations with more tables reported lower average times to detection and resolution, but spent a greater percentage of their team's time doing so.

"These results show teams having to make a lose-lose choice between spending too much time solving for data quality or suffering adverse consequences to their bottom line," said Barr Moses, CEO and co-founder of Monte Carlo. "In this economic climate, it's more urgent than ever for data leaders to turn this lose-lose into a win-win by leveraging data quality solutions that will lower BOTH the amount of time teams spend tackling data downtime and mitigating its consequences. As an industry, we need to prioritize data trust to optimize the potential of our data investments."

Other Findings of Note

The survey revealed additional insights on the state of data quality management.

"Data testing remains data engineers' number one defense against data quality issues, and that's clearly not cutting it," said Lior Gavish, Monte Carlo CTO and co-founder. "Incidents fall through the cracks, stakeholders are the first to identify problems, and teams fall further behind. Leaning into more robust incident management processes and automated, ML-driven approaches like data observability is the future of data engineering at scale."

To read the full report, including commentary and reactions from nearly a dozen industry-leading data executives, click here.

To learn more about how organizations are making data more reliable with Monte Carlo, visit www.montecarlodata.com or request a demo.

About Monte Carlo

As businesses increasingly rely on data to drive better decision making and power digital products, it's mission-critical that this data is trustworthy and reliable. Monte Carlo, the data observability company, solves the costly problem of broken data through its fully automated, SOC 2-certified data observability platform. Billed by Forbes as "the New Relic for data teams" and backed by Accel, Redpoint Ventures, GGV Capital, ICONIQ Growth, and IVP, Monte Carlo empowers companies to trust their data.
