Q&A: Survey Shows Organizations Overly Optimistic About Data Quality
Data quality remains poor at most companies, yet data managers are surprisingly unaware of that fact -- and of the high cost of bad data -- according to a new survey on the subject.
- By Linda Briggs
- 11/10/2009
The business case to improve data quality remains a tough sell. That’s because, according to a new survey, companies are unrealistically optimistic about the quality of their data in the first place -- even though fewer than a third of them actually measure that quality.
Survey respondents were mainly from North America and Europe and represented companies from a wide spectrum of industries and with annual revenues greater than $1 billion.
Similarly, data managers remain painfully unaware of the high cost of bad data -- even at very large companies. “The typical cost of poor data quality is actually ten times or more what survey participants believed,” according to Katherine Hamilton of Pitney Bowes, which co-sponsored the survey along with Silver Creek Systems.
Hamilton, director of product marketing in the enterprise business solutions group at Pitney Bowes Business Insight, offered her perspective on the survey.
TDWI: What was most surprising in the survey results?
Katherine Hamilton: There were three surprising outcomes from the survey. First, in studying the results, one can conclude that many businesses are overly optimistic about their data quality. While only 17 percent of respondents rated their data quality as less than good, just one third said they have any kind of data quality initiative in place. Further, 63 percent said their companies have never made any attempt to measure the impact of poor data quality on their businesses.
The second surprising outcome was that lack of support from senior management was one of the biggest barriers to improving data quality.
Finally, we were very surprised by how much the respondents underestimated the cost of poor data quality to their businesses. Less than 20 percent believed this figure exceeded $1 million, which is strikingly low considering the revenues and sizes of the organizations that responded. The typical cost of poor data quality is actually ten times or more what survey participants believed.
TDWI: A third of respondents rated their data quality as poor; only four percent said their data quality is excellent. Why is that number so low, and what is getting in the way of data quality in general for most companies?
Actually, the low figure isn’t surprising, given that 63 percent of those surveyed said they have no measures in place to ascertain the cost of poor data quality and almost half stated they don’t measure the quality of their data at all. Also, according to survey participants, data quality initiatives are not an imperative for management, which makes the business case to improve data quality difficult.
Without information on the impact of poor data quality on an organization, it can be difficult to make a compelling argument for data quality initiatives, which creates a vicious cycle.
TDWI: Four percent of respondents said their data quality is excellent. Did the survey elicit a sense of what those four percent are doing differently from other companies?
Although the survey does not address that point directly, it is interesting to note that among companies that proactively manage their data, the majority are doing so at an enterprise level. Further, 23 percent indicated that they have combined their data quality initiatives under a data governance umbrella, which is highly recommended. Data quality initiatives such as routine data cleansing and ongoing data maintenance should be regular activities. A data governance program will be ineffective if it’s based on incorrect or duplicate data.
TDWI: Despite poor data quality having been an issue for decades, 63 percent of respondents said they have no idea just what it is costing them. What do you recommend those firms do to get a handle on the cost of poor data quality?
Unfortunately, poor data quality is like a virus, as “infected” records permeate the entire organization. Although a large percentage of survey participants couldn’t put a dollar figure on the price of bad data, they did offer anecdotal evidence -- such as $32 million in stock left orphaned in a warehouse, unsellable because it was lost in the system. Bad data can lead to loss of productivity or incorrect billing for products and services.
The first step is creating an active data quality program that will help your organization understand and measure the quality of data and the costs associated with data errors. The next step is implementing routine data maintenance.
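As a minimal illustration of that measurement step (this sketch is not part of the survey or any Pitney Bowes product, and the file name and key columns are hypothetical), a few lines of Python can profile a data extract for two of the problems respondents cited: incomplete fields and duplicate records.

```python
import csv
from collections import Counter

def profile_quality(path, key_fields):
    """Report per-column completeness and the duplicate rate on key fields."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        return {}

    total = len(rows)

    # Completeness: share of non-empty values in each column.
    completeness = {
        col: sum(1 for r in rows if (r[col] or "").strip()) / total
        for col in rows[0]
    }

    # Duplicate rate: share of rows beyond the first occurrence of each key.
    key_counts = Counter(
        tuple((r[k] or "").strip().lower() for k in key_fields) for r in rows
    )
    duplicates = sum(n - 1 for n in key_counts.values() if n > 1)

    return {
        "rows": total,
        "completeness": completeness,
        "duplicate_rate": duplicates / total,
    }

# Hypothetical usage: profile a customer extract on name and postal code.
print(profile_quality("customers.csv", ["name", "postal_code"]))
```

Even a crude profile like this turns “our data is fine” into a measurable claim, which is the precondition for the cost arguments that follow.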
TDWI: Surprisingly, only 37 percent of respondents have a data quality initiative in place, and 17 percent said they have no plans to start one. What tends to be the issue for companies with no plans to systematically address data quality?
Part of it is a perception problem. Poor data quality has indeed been an issue for a long time, but the idea that fixing poor data is an affordable, manageable endeavor is relatively new. Some companies have been working around their poor data for so long that its impact may not be immediately apparent.
The evidence of poor data quality and its impact may also be distributed throughout the organization’s many processes, making it harder to detect. In these instances, it may take a significant event such as a major data migration initiative, a compliance breach, or an unexplained customer defection to raise awareness of the true scope of data quality issues.
TDWI: What were the top problems that firms reported finding with data quality?
The top problems related to data quality included a lack of clear ownership and responsibility for data, duplicate data, and data in silos throughout the organizations. Participants also listed as critical concerns the absence of a data governance policy, and a lack of understanding of what data was accessible and how to access it. They also indicated that poor data quality negatively affects productivity as well as the ability to manage assets and inventory, and results in incorrect billing for products and services. It also raises possible regulatory issues.
TDWI: The top two barriers to progress with data quality initiatives are “Management does not see this as an imperative” and “It’s very difficult to present a business case.” How can those issues best be addressed within companies?
The first step would be to measure your data quality. When speaking to senior executives, bear in mind that they tend not to care about data per se; they care about their processes and systems. If no attempt is made to estimate bad data’s impact on the organization, it can be difficult to make a compelling argument for data quality initiatives. By demonstrating the consequences of bad data, such as incorrect billing or mismanaged inventory, you can argue persuasively that data quality initiatives aren’t the expensive option; the cost of poor data quality is far higher and pervades the whole organization.
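To make that argument concrete, a back-of-the-envelope model is often enough. In the sketch below, every input is an invented assumption rather than a survey figure; it simply prices one consequence Hamilton mentions, incorrect billing.

```python
# Back-of-the-envelope cost of billing errors caused by bad data.
# Every figure below is an illustrative assumption, not survey data.
monthly_invoices = 50_000
error_rate = 0.02        # assumed share of invoices with data errors
cost_per_error = 45.00   # assumed rework, credit, and support cost per error

annual_cost = monthly_invoices * error_rate * cost_per_error * 12
print(f"Estimated annual cost of billing errors: ${annual_cost:,.0f}")
# Prints: Estimated annual cost of billing errors: $540,000
```

Swapping in real transaction counts and measured error rates from the profiling step above turns this into the kind of dollar figure executives respond to.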
TDWI: What did the survey find are the top barriers to data quality in companies, and what does that finding reveal?
Senior executive management continues to view addressing poor data quality as a significant cost. However, the business cost of poor data quality can run into the millions, if not tens of millions -- and those are just the hard costs. It is always fascinating to speak to customers after they have implemented a data quality solution. Many report not just their satisfaction at fixing the primary problem but also their surprise at how many “unnoticed” or unmeasured secondary and tertiary issues were resolved as well.
What this tells us is that companies that treat their corporate data as a valued asset to be honed and managed will always have the competitive advantage.
TDWI: Based on the survey results, what trends can we expect in the future, both on the part of companies and from vendors?
It’s our belief that among companies, we will see greater adoption of data quality initiatives. Smart companies will embrace data governance and data quality. Master data management will continue to grow.
Among vendors, we will see more consolidation and we will see vendors expanding beyond customer data and embracing other domains. For example, Pitney Bowes Business Insight is releasing a new Spectrum Technology Platform that contains all of the standard data quality solutions, as well as the location domain for companies that require a singular view of their customers, including location-based information.