
Key Trends Driving Real-Time Data Access

We look at the key drivers behind the demand for real-time data this year

by Chris McAllister

In 2008, the need for business operations to approach real time will increase as customers and businesses demand more immediate access to information and services. Globalization has created a 24x7x365 world, and it is up to IT to support continuous operations of critical systems as well as the necessary exchange and integration of information across all areas of a business.

With a growing number of business users and activities dependent on real-time access to real-time information, it is nearly impossible to find a company or function that wouldn’t benefit from accurate, up-to-date data. Whether the context is equity markets and currency changes, account balances and user authentication, help desks, marketing promotions, supply chains, patient care, or sales and manufacturing, almost any organization can justify the demand for faster, more accurate information. Several key trends will drive the demand for real-time data in 2008: standardization of low-latency data integration across disparate systems, stricter regulations and service level agreements (SLAs), heterogeneous IT environments, the management and maintenance of very large database (VLDB) implementations, and globalization.

Real-time Data: The New Standard

In today’s business climate, real-time data is no longer a luxury. Corporations need real-time data to meet customer expectations as well as organizational needs for analytics and business intelligence. In a recent survey by IDUG (the International DB2 Users Group) on the changing role of the data warehouse, 53 percent of respondents said they could not work with data older than an hour for key applications (see Note 1). Once a business comes to rely on a system for lookups, regular reports, and analytics, it becomes dependent on that information to perform key business processes and requires that information to be fresh and always available. The burden is on IT to support the business demand for accurate, available, up-to-the-minute information.

Technologists will feel increased pressure as real-world businesses begin to see the ROI of obtaining data in real time. For example, a major U.S. cable provider reduced the latency of moving customer data from its CRM system downstream to the data warehouse to mere seconds, which directly improved how quickly a customer could receive a response and be fully serviced.

A leading online retailer monitors and acts on customer information from operational transaction systems integrated with a central data warehouse. With fresher data, the retailer can segment and personalize marketing offers more efficiently, improving the return on marketing spend; customer satisfaction rises as well, and customers are more likely to purchase relevant products. As more industries realize these benefits, we can expect the number of real-time data initiatives to grow.
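Both examples rest on the same underlying pattern: capture changes as they are committed on an operational system and apply them to the warehouse continuously, rather than in nightly batches. Here is a minimal Python sketch of the idea using SQLite and a polling high-water mark (the table names and the watermark approach are illustrative assumptions; production replication tools typically capture changes from the database transaction log rather than polling):

    import sqlite3

    source = sqlite3.connect(":memory:")     # stands in for the CRM / OLTP system
    warehouse = sqlite3.connect(":memory:")  # stands in for the data warehouse

    source.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
    warehouse.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")

    last_applied = 0  # high-water mark: the last source row already applied downstream

    def sync_once():
        """Copy any rows committed on the source since the last sync."""
        global last_applied
        rows = source.execute(
            "SELECT id, customer, amount FROM orders WHERE id > ?", (last_applied,)
        ).fetchall()
        for row in rows:
            warehouse.execute("INSERT INTO orders VALUES (?, ?, ?)", row)
            last_applied = row[0]
        warehouse.commit()
        return len(rows)

    # New business arrives on the operational side ...
    source.execute("INSERT INTO orders (customer, amount) VALUES ('acme', 120.0)")
    source.commit()

    # ... and is available downstream seconds later, not after a nightly batch.
    print(sync_once(), "row(s) applied downstream")

Run on a short interval, a loop like this keeps downstream latency at seconds rather than hours; log-based capture removes even the polling overhead.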

Availability Requirements Become More Stringent

As the “always-on” business climate has extended to more industries in the last few years, the demand for data to keep pace with the speed of business has increased, along with tighter industry regulations and more stringent SLAs. Many of today’s online applications face tougher requirements, including online banking and bill pay, fraud prevention, point-of-sale systems, clinical information systems, call centers, Web-based shopping, online travel reservations, and supply chains. New revenue-generating products and services have created real-time authorization requirements, driven by the need to control and monitor large volumes of complex, simultaneous transactions. In many industries, any degree of system downtime is no longer acceptable because products and services can be accessed only online, creating pressure to achieve higher standards of availability.
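To make that pressure concrete, consider how little downtime common availability targets actually permit. A quick back-of-the-envelope calculation (the “nines” tiers shown are illustrative, not drawn from any particular SLA):

    MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

    for availability in (0.99, 0.999, 0.9999, 0.99999):
        allowed = MINUTES_PER_YEAR * (1 - availability)
        print(f"{availability:.3%} uptime allows {allowed:8.1f} minutes of downtime per year")

    # 99.000% uptime allows ~5256.0 minutes per year (~3.7 days)
    # 99.900% uptime allows  ~525.6 minutes (~8.8 hours)
    # 99.990% uptime allows   ~52.6 minutes
    # 99.999% uptime allows    ~5.3 minutes

At “five nines,” a single extended maintenance window can consume the entire annual budget, which is why planned downtime deserves as much attention as unplanned outages.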

Ensuring that information is continuously available is paramount when preparing for unplanned outages such as natural disasters and power disturbances. Yet many IT groups overlook the downtime required for planned events: ongoing system maintenance, upgrades, and migrations have become a fact of life.

Heterogeneity is the Norm

In decades past, many organizations would adopt one standard, as in an “Oracle-only” or “strictly SQL” shop. Today, organizations are more likely to choose database environments based on a variety of factors, ranging from project needs and job requirements to cost of ownership.

Consider the impact of open source systems and some of the newer platforms that have emerged over the last ten years. Enterprises today are more likely to plan for Linux and to adopt Microsoft SQL Server more broadly, increasingly for enterprise-grade applications, alongside open source databases such as MySQL and Ingres. Today’s data management and warehouse projects are more complex than ever, and organizations often need different types of technologies (ETL, EAI, EII, data quality, and so on) to co-exist in increasingly heterogeneous environments.
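One practical consequence of heterogeneity is the value of a common access convention. As a small illustration, Python’s DB-API 2.0 (PEP 249) defines the connect/cursor/execute pattern that most database driver modules follow, so the same integration code can reach very different back ends. The sketch below uses the standard library’s sqlite3 module as a stand-in so it runs on its own; the file and table names are invented:

    import sqlite3

    def row_count(connect, dsn, table):
        """Count rows in `table` via any DB-API 2.0-compliant driver."""
        conn = connect(dsn)
        try:
            cur = conn.cursor()
            cur.execute("SELECT COUNT(*) FROM " + table)  # table name is trusted here
            return cur.fetchone()[0]
        finally:
            conn.close()

    # A throwaway SQLite database makes the example self-contained.
    setup = sqlite3.connect("demo.db")
    setup.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY)")
    setup.execute("INSERT INTO customers DEFAULT VALUES")
    setup.commit()
    setup.close()

    # The same row_count() call works unchanged with other DB-API drivers
    # (connection arguments differ by driver, of course).
    print(row_count(sqlite3.connect, "demo.db", "customers"))

Shared conventions like this are what let heterogeneous shops write integration logic once instead of once per database.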

Finally, acquisitions and consolidation have an undeniable impact on the data center blueprint. As a result, enterprises must be prepared to integrate data from a variety of external sources and disparate systems in the most efficient and cost-effective manner.

VLDB Implementations are Catching Up to Scientific Databases

Today, most “very large” databases (VLDBs) are measured in tens of terabytes, with a few reaching hundreds of terabytes. Terabyte-sized databases were once thought inconceivable, or at the very least difficult to manage and maintain; although still considered large, they are becoming more common. Over the next year and beyond, industry VLDB implementations will begin to catch up to scientific databases, with more and more reaching “peta-scale.” Building a multi-petabyte database at a reasonable cost is already possible, but it requires significant effort. With the combined challenges of large data volumes, real-time requirements, and the expectation that data be available in minimal time, the push toward the “new peta-scale industry VLDB” will continue.
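A rough calculation shows where that effort goes. Assuming roughly 100 MB/s of sequential read throughput per disk (a plausible figure for current hardware; the numbers are illustrative assumptions, not benchmarks), scanning a petabyte even once demands massive parallelism:

    PETABYTE = 10 ** 15          # bytes
    DISK_READ = 100 * 10 ** 6    # ~100 MB/s sequential read per disk (assumed)

    for disks in (1, 100, 1000):
        hours = PETABYTE / (DISK_READ * disks) / 3600
        print(f"{disks:>5} disks in parallel: {hours:8.1f} hours to scan 1 PB")

    # 1 disk: ~2777.8 hours (nearly four months); 1,000 disks: ~2.8 hours.
    # The parallel hardware, and the engineering needed to manage it, is the
    # "significant effort" that peta-scale implies.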

Globalization Heightens Data Sharing Urgency

Globalization means IT operations are increasingly spread across locations worldwide, making the speed and accuracy of data sharing more critical. Factors such as outsourcing and multiple, distributed business locations increase the need for fast, seamless, and accurate data integration. Even when data is collected and housed in different locations, it must reach business applications while still fresh. Real-time data sharing and replication provide the essential movement and management of data across any geographic distance. Business globalization will continue to be one of the most significant drivers behind the demand for real-time data.
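When data crosses continents, the practical question is how stale it is on arrival. One simple way to quantify freshness is to stamp each change record with its source commit time and measure the lag on apply, as in this illustrative sketch (the record format and names are invented):

    import time

    def make_change(payload):
        # Each change record carries the time it was committed at the source.
        return {"payload": payload, "committed_at": time.time()}

    def apply_change(change, store):
        # The receiving site measures staleness at the moment it applies the change.
        lag = time.time() - change["committed_at"]
        store.append(change["payload"])
        return lag

    remote_store = []
    change = make_change({"account": "A-1001", "balance": 250.0})
    time.sleep(0.05)  # stand-in for network transit between continents
    lag = apply_change(change, remote_store)
    print(f"applied with {lag * 1000:.0f} ms of lag")

In practice, clock skew between distant sites complicates this kind of measurement, but the point stands: freshness across geographies can, and increasingly must, be measured.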

Outlook: In 2008, Real-Time Data Will Rise Up the Priority List

Organizations still have a long way to go in realizing their data management initiatives, whether the goal is data integration, consolidation, business intelligence, or system availability. According to Gartner research, many organizations are still delivering strategic data integration initiatives with tactical and outdated approaches (see Note 2). A gap remains between the business demand for real-time data and the reality of IT fully implementing this capability.

With data volumes and the global demand for shared information growing, industry regulations targeting accuracy and availability of data becoming more stringent, and IT architectures growing more heterogeneous, real-time data access is poised to be a major priority for both IT and business decision-makers in 2008. With IT and business at a crossroads, all eyes are on the data.

Notes:

1: IDUG, August 2007: “Moving From Analytics to Operational Business Intelligence: The Changing Role of the Data Warehouse”

2: Gartner, October 22, 2007: “Roundup of Business Intelligence and Information Management Research 3Q07”

- - -

Chris McAllister is the senior director of product management for GoldenGate Software, where he is responsible for GoldenGate product planning and direction, as well as product training and documentation. You can reach the author at [email protected].
