Five Steps to Data Quality in Enterprise Mergers

As merger and acquisition activity picks up this year, there will be an even heavier emphasis on the quality of data combined from the two enterprises

Oracle's acquisition of PeopleSoft, Honeywell International's pact to buy Britain's Novar, and recent reports of a Sprint/Nextel combination are clear signals that 2005 may emerge as the year of mergers and acquisitions (M&A). Companies are trying to grow in an extremely competitive marketplace under downward pressure on already-thin margins. Now that Corporate America is sitting on its largest cash reserves in more than a decade, and is willing to spend, it is evident that M&A will become the conduit for advancing business strategies and objectives.

However, mergers are getting increasingly complex. Business operating environments have changed significantly over the past several years, with increased regulation of market share, auditing, and reporting. Add to that the legacy of the last large wave of mergers, in which companies did not consolidate existing systems and data, leaving a need to convert from multiple source systems to a single target system that might not be the core application of the acquiring organization. Now factor in the latest requirements associated with Sarbanes-Oxley, Basel II, the Patriot Act, and other operating controls. Multiply this scenario by tens of applications or more, and data movement becomes a far higher-risk undertaking than ever before, even as the tolerance for incorrect information decreases.

The demand for quality enterprise information is at a fever pitch, and increased reliance on enterprise technology requires the underlying integrated data to be accurate. That makes data quality even more important to executing mergers flawlessly. What can companies do to ready themselves for this new wave of mergers from a data and information integrity standpoint? How can they better manage data quality?

The next wave of mergers will bring with it new complexities that can stress even the most prepared companies. Understanding and mapping products, systems, and data flows, and improving the quality of organizational data before integration, are critical to success. A comprehensive data quality and integration strategy that uses proven tools and best practices to reduce manual data movement can keep a large, complex merger on time, on track, and within budget.

Our five-step plan for managers addresses these challenges.

Step 1: Map Your Products, Systems, and Data Flows Before the Next Merger

The most fundamental step toward ensuring that data survives a complex merger is one many companies either have not taken or have not maintained since the last merger wave: establishing a blueprint of the enterprise application and information architecture. To complete this, a company should develop a map documenting all of the enterprise's customer groups, products, business functions, and application systems, as well as the data flows and technical architecture presently in place to support the company. Developing these maps from various viewpoints gives an organization the information needed to drive its information strategies and to remedy current data quality issues before taking on additional acquisitions.

Once an acquisition is announced, the organization can chart all of the acquired company's applications and information flows and compare them to current environment maps. This process will allow organizations to quickly identify, assess, and close major gaps during the integration effort.
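To make that comparison concrete, the environment maps can be kept in machine-readable form. The sketch below is a minimal, hypothetical illustration in Python: each environment is an inventory of applications and the directed data flows between them, and a simple set difference surfaces what the acquiring company's map does not yet cover. The structures and names are illustrative assumptions, not a prescribed format.

```python
# Minimal sketch: represent each company's environment map as a set of
# applications and a set of directed data flows, then diff the two.
# All names and structures here are illustrative assumptions.

acquirer = {
    "apps": {"GL", "CRM", "Billing"},
    "flows": {("CRM", "Billing"), ("Billing", "GL")},
}

acquired = {
    "apps": {"GL-Legacy", "CRM", "OrderEntry"},
    "flows": {("OrderEntry", "CRM"), ("CRM", "GL-Legacy")},
}

# Applications and flows present in the acquired environment but absent
# from the acquirer's map are integration gaps to assess and close.
gap_apps = acquired["apps"] - acquirer["apps"]
gap_flows = acquired["flows"] - acquirer["flows"]

print("Unmapped applications:", sorted(gap_apps))
print("Unmapped data flows:", sorted(gap_flows))
```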

Step 2: Assess and Address the Data Quality Issues

The challenge of providing reliable, accurate information from an enterprise architecture has never been more important, yet companies continue to struggle with data quality. In fact, recent studies predict that almost half of all strategic, data-driven initiatives may fail because of denial about data quality issues. Companies therefore need to place strong emphasis on upfront data analysis, performed as thoroughly and accurately as possible, to understand anomalies, redundancies, and inaccuracies in source data before attempting to extract, clean, and integrate it.

Data quality assessments, performed with best practices and automated tools and technologies, can be completed in a relatively short period. Utilizing proven processes and techniques such as domain studies, structural inferences, redundancy inferences, and data-rule validation ensures a focus on:

  • Completeness
  • Domain conformance
  • Formats and patterns
  • Consistency
  • Reasonableness
  • Attribute value integrity
  • Structural integrity
  • Date, time, and numeric conformance

That focus enables companies to build a solid foundation of complete and accurate source-system knowledge and positions them to quantify the impact of data quality on strategic, data-driven initiatives.
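As a rough illustration of what such an assessment checks at the column level, the sketch below profiles a few records for completeness, domain conformance, and format/pattern conformance. The field names and rules are hypothetical; in practice they come from automated profiling tools and the data rules identified during the assessment.

```python
import re

# Hypothetical source records; in practice these come from the source system.
records = [
    {"cust_id": "C001", "state": "NY", "phone": "212-555-0147"},
    {"cust_id": "C002", "state": "XX", "phone": "5550123"},
    {"cust_id": None,   "state": "CA", "phone": "415-555-0199"},
]

VALID_STATES = {"NY", "CA", "TX"}                    # domain rule (illustrative)
PHONE_PATTERN = re.compile(r"^\d{3}-\d{3}-\d{4}$")   # format rule (illustrative)

total = len(records)
complete = sum(1 for r in records if r["cust_id"] is not None)
in_domain = sum(1 for r in records if r["state"] in VALID_STATES)
well_formed = sum(1 for r in records if r["phone"] and PHONE_PATTERN.match(r["phone"]))

# Each ratio quantifies one data quality dimension from the list above.
print(f"Completeness (cust_id): {complete}/{total}")
print(f"Domain conformance (state): {in_domain}/{total}")
print(f"Pattern conformance (phone): {well_formed}/{total}")
```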

Step 3: Implement a Data Quality and Integration Strategy

By leveraging data quality assessments, companies can make informed business decisions to address anomalies, redundancies, and inaccuracies in source data. There are two approaches to data quality strategy implementation: active and passive. With an active approach, data quality is improved at the point of capture within source systems through manual corrections, refined software governing data entry, reengineered business processes guiding data entry, and enhanced training programs. With the passive approach, data quality is improved outside the source system using automated tools and technologies during data movement and data cleansing processes. Cost/benefit analyses play a major role in guiding active or passive data remediation; however, regardless of the chosen approach, data quality improvement must be continuous—not a one-time event. To maximize return on investment, companies must implement people, process, and technology programs to continually monitor and improve data quality to maintain consistent, accurate, reliable, and trusted data.
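To illustrate the passive approach, the sketch below applies hypothetical correction rules while a record is in flight from source to target, leaving the source system untouched. The mapping table and normalization rule are assumptions for illustration only.

```python
import re

# Passive approach (illustrative): correct data during movement rather than
# in the source system. The remediation mapping is a hypothetical example.
STATE_FIXES = {"N.Y.": "NY", "Calif.": "CA"}

def cleanse(record: dict) -> dict:
    """Return a cleansed copy of a source record for loading into the target."""
    out = dict(record)
    # Standardize state codes using the remediation mapping.
    out["state"] = STATE_FIXES.get(out["state"], out["state"])
    # Normalize phone numbers to NNN-NNN-NNNN when ten digits are present.
    digits = re.sub(r"\D", "", out.get("phone") or "")
    if len(digits) == 10:
        out["phone"] = f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"
    return out

source_row = {"cust_id": "C002", "state": "N.Y.", "phone": "(212) 555 0147"}
print(cleanse(source_row))  # {'cust_id': 'C002', 'state': 'NY', 'phone': '212-555-0147'}
```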

Step 4: Manage Data Movement with More Brains, Less Brawn

Strategic, data-driven initiatives such as data acquisitions, integrations, and migrations are far more likely to succeed when performed by subject-matter experts using proven methodologies and automated tools and technologies. This comprehensive approach can reduce costs by up to 35 percent and accelerate implementations by up to 50 percent. The following proven best practices can contribute to your success (a sketch of one automated control follows the list):

  • Augment your team with data acquisition, integration, and migration specialists from an established data management consulting firm that emphasizes client empowerment through mentoring and knowledge transfer.

  • Use a data profiling methodology to comprehensively understand the source data being acquired, integrated, or migrated; to establish a foundation for correcting data quality issues; and to improve the team's ability to predict the overall timeline, work effort, and costs.

  • Use proofs of concept—on a scaled version of the technical architecture—to verify the level of effort required for acquiring, integrating, or migrating data.

  • Use a proven, documented methodology to facilitate the acquisition, integration, and migration of data.

  • Use tools and technologies to automate and accelerate the acquisition, integration, and migration of data.
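One control worth automating during any data movement is source-to-target reconciliation. The following is a minimal, hypothetical sketch: it compares row counts and order-independent per-column checksums so a migration run fails loudly rather than silently dropping or altering records. Production implementations would build such checks into the migration tooling itself.

```python
import hashlib

def column_checksum(rows, column):
    """Order-independent checksum of one column across all rows."""
    digests = sorted(hashlib.md5(str(r[column]).encode()).hexdigest() for r in rows)
    return hashlib.md5("".join(digests).encode()).hexdigest()

def reconcile(source_rows, target_rows, columns):
    """Verify counts and per-column checksums match after a data movement."""
    if len(source_rows) != len(target_rows):
        raise ValueError("row count mismatch between source and target")
    for col in columns:
        if column_checksum(source_rows, col) != column_checksum(target_rows, col):
            raise ValueError(f"checksum mismatch in column {col!r}")
    return True

source = [{"cust_id": "C001", "state": "NY"}, {"cust_id": "C002", "state": "CA"}]
target = [{"cust_id": "C002", "state": "CA"}, {"cust_id": "C001", "state": "NY"}]
print(reconcile(source, target, ["cust_id", "state"]))  # True: counts and checksums agree
```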

Step 5: Make Data Quality an Ongoing Business Priority

Too often, organizations spend significant financial and human resources on one-time projects to fix data quality problems, then implement additional change events without continuing those data quality measures. Organizations, especially active acquirers, need to elevate the issue of data quality. In prioritizing data quality, an organization must identify who owns the data. In most cases, multiple internal groups share common data, which calls for a Data Quality Champion responsible for cutting across multiple internal business and technical organizations to ensure that management decision-makers have correct data.

By making data quality an ongoing business priority, an organization ensures that all of these efforts become adopted best practices—not just a one-time clean-up process.

Measure Twice, Cut Once

Mergers are a given in today's competitive business climate, and there is little doubt that the next wave will feature compliance requirements, IT complexities, and a need to deliver quality data in near real time to support enterprise business processes. Be ready: use these five steps to build a solid, proactive framework for merger and acquisition success.
