Operational Data Quality is Within Reach
High-quality operational data is possible.
High-quality operational data doesn’t have to be a pipe dream, says industry veteran Philip Russom, senior manager with TDWI Research.
In a new TDWI Checklist Report, Russom outlines an incremental approach to ensuring quality data in the context of operational applications and their databases.
For one thing, Russom counsels, shops need to start thinking about data quality in a different way. “We all use the term data quality as if it represents a single, monolithic approach to improving the quality of data. Even data quality experts talk this way, although they know that data quality is far more diverse and complex in actual practice,” writes Russom. “[D]ata quality is, in fact, a collection of techniques. Each technique operates uniquely to achieve specific data improvements or transformations in support of specific business goals.”
Second, he urges, organizations need to recast their data quality issues in business and not solely in technical terms. “For example,” Russom points out, “the business may need greater consistency for data shared across business units, reduced costs for direct mail, a more complete view of customers, or accurate methods for pinpointing locations.” The important takeaway, he emphasizes, is that “business people should determine these needs and communicate them to data management professionals -- not vice versa.”
From this perspective, operational data quality can be approached in an iterative fashion. Russom, for example, suggests starting with a significant (but nonetheless achievable) goal: data quality standardization. “If the business needs more consistent data for sharing, then data standardization is a helpful DQ technique,” he observes. Russom points to TDWI survey numbers showing that data standardization is the most commonly used DQ technique.
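To make the standardization idea concrete, here is a minimal sketch of the technique in Python: variant values in an operational record are mapped to a single canonical form via a lookup table. The field name (`state`) and the tiny lookup table are hypothetical illustrations, not taken from Russom's report; production DQ tools use far larger reference data.

```python
# Canonical forms for a "state" field that arrives in inconsistent variants.
# This table is a toy example; real standardization relies on extensive
# reference data maintained by the DQ tool or the organization.
STATE_CANON = {
    "calif.": "CA", "california": "CA", "ca": "CA",
    "n.y.": "NY", "new york": "NY", "ny": "NY",
}

def standardize_state(value: str) -> str:
    """Return the canonical state code, or the trimmed original if unknown."""
    key = value.strip().lower()
    return STATE_CANON.get(key, value.strip())

records = [{"state": "Calif."}, {"state": "new york"}, {"state": "TX"}]
standardized = [{**r, "state": standardize_state(r["state"])} for r in records]
```

The point of the sketch is the shape of the technique, not the table itself: standardization is a deterministic mapping to agreed canonical values, which is why it is a natural first DQ increment when the business goal is consistency of shared data.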
Other start-simple scenarios include name and address cleansing, along with augmenting in-house data by purchasing consumer data from third-party providers. Adding other incremental quality enhancements -- such as geocoding -- can likewise bolster the quality of your data.
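Name and address cleansing can be sketched the same way. The example below is an illustrative simplification, assuming a small hand-made table of street-suffix abbreviations; real cleansing products validate against full postal reference data.

```python
import re

# Hypothetical street-suffix table for illustration only; commercial
# address-cleansing tools draw on complete postal reference files.
SUFFIXES = {
    "street": "St", "st.": "St",
    "avenue": "Ave", "ave.": "Ave",
    "road": "Rd", "rd.": "Rd",
}

def cleanse_address(raw: str) -> str:
    """Collapse whitespace, normalize casing, and standardize street suffixes."""
    words = re.sub(r"\s+", " ", raw.strip()).split(" ")
    cleaned = []
    for word in words:
        key = word.lower()
        # Replace a known suffix variant with its standard abbreviation;
        # otherwise just title-case the word.
        cleaned.append(SUFFIXES.get(key, word.title()))
    return " ".join(cleaned)
```

For example, `cleanse_address("  123  main   STREET ")` yields `"123 Main St"`. Geocoding would then build on cleansed addresses like these, which is why it slots in naturally as a further increment.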
Choosing the right tool for the job is important, too, Russom says. Not all DQ tools are created equal, and although a few vendors can credibly claim to address multiple DQ techniques, specialty vendors still abound. In particular, many data quality tools have a history of focusing on customer data, which is just one of the many data domains managed within operational applications and their databases.
Customers have several options: they can mix and match tools, licensing best-of-breed offerings from particular vendors to address their customer, product, financial, supplier, location, or other data quality needs. Alternatively, Russom explains, shops can partner with a large vendor for most of their DQ needs while tapping best-of-breed offerings for particular (or specialized) DQ requirements. In some cases, he concedes, it might make sense to partner with a single DQ player.
The key is to strike a balance between cost and convenience on the one hand and functionality and performance on the other. “[I]f you need to implement multiple techniques, you may need to acquire multiple tools. Some smaller vendors have tools for only a few of the established data quality techniques. Large vendors typically have tools for every technique, but the tools integrate and interoperate at varying levels,” Russom observes.
“To alleviate this issue, future-facing vendors are integrating their suites of tools into unified data quality platforms … or even broader [into] data management platforms,” he continues. “[Y]ou need to know what data quality techniques are available … and which ones you need … before selecting vendor tools for evaluation.”
Above all, Russom concludes, bear in mind that operational data “is inherently multi-domain.” As a result, he urges, shops should “look for tools with a track record of supporting multiple data domains, not just the customer domain.”