In-Depth
Time-Sensitive BI Determines Competitive Advantages
The ability of information systems to perform designated OLTP functions in a timely manner has been an obsession with the IT community for years. The success of IT professionals has been directly tied to their systems' ability to keep up with the transaction volumes of the business. In support of this, an entire discipline of TP performance metrics has developed. As we enter the 21st century, the time has come for this same level of obsession to be applied to the business intelligence arena.
Discussions on metrics to measure the "performance" of a business intelligence system must be very different from the discussions for a TP system. Where it is very easy to judge the efficiency of a TP process, discussions of business intelligence are fundamentally discussions of change and an organization's ability to adapt to that change.
Appropriate metrics to measure the value of a business intelligence system should include:
Depth & Breadth. Does the system provide enough detail? Does the system provide a wide enough view of the business problem?
Responsiveness & Flexibility. Quick response times promote more refined questions. Can the system meet the user's conceptual model of their business?
Accessibility. Where the decision takes place can be as important as the decision itself.
Accuracy. Does the system capture all the data at any point in time? How does it deal with latency?
Timeliness. Is the information available to the user before they need it? Time to first query is as important as query response time.
Consistency. Can the system respond to the user's needs in a consistent time frame, regardless of the query posed?
Historically, for many industries, the above measurements simply did not make sense. This was due, in part, to the source of data needed to perform business intelligence. Large companies were simply consumers of syndicated data that was delivered equally to all participants in the industry. Furthermore, the data was already prepared for the industry's use. There was very little opportunity, need or legal right to enhance or change this data. Competitiveness depended strictly on how the data was used (read: queried), with no room to add competitive advantage through the data process itself.
This status quo, however, has changed. Companies in industries that were handcuffed with only syndicated data have found reservoirs of data within their own organizations and are struggling with how to extract and use it. Even companies in industries without any syndicated data are beginning to explore their own data warehouses and operational systems for information that can be used in business intelligence.
Of the above metrics, the overriding one, and the one that ultimately provides competitive advantage, is timeliness. The other factors are important as well, but mainly in how they impact the timeliness of the data delivery and its intrinsic value. In the end, the sooner an organization can start using its data, the more value it will derive.
Timeliness
Any given data point has a distinct shelf life. The value of a data point is not measured in the event that produced the data point, but in how that point is used and by when. Its value proposition varies over time, moving from short-term tactical to long-term strategic and finally losing the majority of its value as its individuality gets lost over the fullness of time (see Figure 1 on page 47).
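This shelf-life idea can be sketched as a simple decay function. Note that the stage boundaries and values below are invented purely for illustration; they are not taken from the article's Figure 1.

```python
def data_point_value(age_days: float) -> float:
    """Illustrative model of a data point's declining value as it ages.

    The stages mirror the Tactical / Mid-Term / Strategic progression
    described in the text; all numbers are hypothetical.
    """
    if age_days <= 7:        # Tactical stage: highest, fastest-decaying value
        return 100.0 - 10.0 * age_days
    if age_days <= 60:       # Mid-term stage: moderate operational value
        return 30.0 - 0.4 * (age_days - 7)
    if age_days <= 365:      # Strategic stage: steady long-term trend value
        return 8.0
    return 1.0               # Individuality lost in aggregate history

for age in (1, 30, 180, 400):
    print(f"age {age:3d} days -> value {data_point_value(age):.1f}")
```

The exact curve an organization faces will differ by industry; the point the sketch makes is only that value falls monotonically, and steeply, in the early tactical window.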
The issue for many organizations is how to capture the data point early enough in its life cycle to reap the most benefit. For many business intelligence systems, unfortunately, the data point is not captured and put into use until it has passed into either its Mid-Term or Strategic stage. Only the most cutting-edge organizations recognize and can act on the value of a data point in its early Tactical stage.
Barriers to Early Stage Value
Exploiting data at an early stage requires that an organization, and its supporting business intelligence system, overcome a number of obstacles. These obstacles are data capture, recognition, reconciliation, storage, staging and delivery. Taken together, these obstacles add up to one thing: a significant time delay between event creation and its use in a business intelligence system.
Depending on the sales and/or process strategy of an organization, the time from the occurrence of an actual event to when it first appears in their system can vary considerably. Compare, for example, ATM activity being reported back through the federal system to a member bank, versus a car purchase at a dealer being reported back to the manufacturer. The ATM activity will post to the member bank in a matter of minutes, while it might take until month's end before the car sale is communicated to the manufacturer.
Once the event appears on the system, it must be recognized as an event to be stored, its validity must be checked, and only then can it be processed through the transaction system. In most organizations, it is at this point that the event is considered ready for warehousing or further processing into a business intelligence system.
Given the inherent delays of moving the event through a TP system, the challenge must then be to shorten the remaining time needed to make the event actionable. It is in this area where business intelligence systems stand to improve the most. In today's market, very few systems are capable of taking the data from its raw form and moving it directly into an actionable data format. The industry is fragmented into players who can extract the data, or extract and transform, or extract, transform and load, but very few who can do all of the above as well as stage the data for distribution and later analysis in a "business intelligence" friendly form.
Consider the situation of a corporation that, not surprisingly, is an MVS shop for the majority of its large data processing. With very few exceptions, a business intelligence system will first require that company to download the data needed to fill the requirements of the BI applications. Note that the company is also expected to handle the sheer volume of the raw data, or engage in an expensive application development effort that will pre-summarize the data into a smaller form before it is downloaded. At this point an advanced system that specializes in transforming this data into information, typically on UNIX or NT, will work with the data, then pass it along to yet another system for staging. This next system could reside on the same machine as the transform engine or perhaps on another type of platform altogether. At this point, and only at this point, is the data ready for business intelligence use.
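The cumulative cost of these hand-offs can be made concrete with a small sketch. The stage names loosely follow the pipeline just described, but the latency figures are invented for illustration; real delays vary widely by shop.

```python
from typing import NamedTuple

class Hop(NamedTuple):
    """One hand-off in the multi-system BI pipeline."""
    name: str
    hours: float  # hypothetical latency for this stage

# Hypothetical pipeline mirroring the MVS -> transform -> staging flow above.
PIPELINE = [
    Hop("download raw data from MVS source", 4.0),
    Hop("transform on UNIX/NT engine", 6.0),
    Hop("move result to staging system", 2.0),
    Hop("load into BI-friendly store", 8.0),
]

def total_delay(hops: list[Hop]) -> float:
    """Every hand-off between platforms adds latency before the data is usable."""
    return sum(h.hours for h in hops)

elapsed = 0.0
for hop in PIPELINE:
    elapsed += hop.hours
    print(f"after '{hop.name}': {elapsed:.1f} hours elapsed")
print(f"data actionable after {total_delay(PIPELINE):.1f} hours")
```

The design point is that latency is additive across system boundaries, which is why consolidating extract, transform, load and staging into fewer hops directly improves timeliness.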
But what if the data that is needed for business intelligence is needed NOW? What if the time value of the data is so eroded by the time the data is actually turned into actionable information that it can provide only minimal ROI? Or, what if you just don't know the actual time value of the data?
Most business justifications for business intelligence assume that the maximum value for a data point, or its aggregate information set, is flat: that "whenever the data arrives is good enough." Intuitively, this cannot be true. Yet it is accepted in all but the most extreme circumstances. Why? Mostly because the end users who see this data for the first time are either trying to understand its value and uses, or are so grateful to just get it at all that they are not going to "rock the boat."
Time-Sensitive Systems
The first step in building a time-sensitive business intelligence system must be to inventory all the data that might be a candidate for business intelligence. The data must be rank-ordered by its time value. Those with the shortest slope to full value should be addressed first. It is important to remember that while we are discussing data's time value, you must also consider the data's information value. The distinction between the data's direct value and its derived information value can be critical. One piece of data might be promoted in the priority list not because of its direct value but because of its information value.
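The prioritization step above can be sketched in a few lines. The candidate datasets, their time-to-peak figures and information-value scores are all hypothetical examples, not data from the article.

```python
# Hypothetical inventory of candidate datasets for a BI system.
candidates = [
    {"name": "call detail records", "days_to_peak": 1,  "info_value": 9},
    {"name": "monthly billing",     "days_to_peak": 30, "info_value": 4},
    {"name": "churn indicators",    "days_to_peak": 14, "info_value": 10},
]

# Rank by steepness of the time-value slope (shortest time to peak first);
# break ties by derived information value, so data can be "promoted" on
# information value as the text describes.
ranked = sorted(candidates, key=lambda d: (d["days_to_peak"], -d["info_value"]))

for d in ranked:
    print(f"{d['name']}: peak in {d['days_to_peak']} days, info value {d['info_value']}")
```

A real inventory would need an agreed scoring rubric for both axes; the sketch only shows that once scores exist, the ranking itself is mechanical.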
Consider, for instance, a large telecommunications company that requires immediate analysis of millions of call data records on a 60-day rolling basis by the next morning. How does this company begin to tackle such a monumental task, and one so intrinsic to establishing its competitive advantage? The answer lies in technology that enables the end user to have access to the complete answer set using what might be considered a data hologram. Using a mathematical representation of all possible answers to every possible question, seemingly impossible amounts of data become easily and quickly accessible, enabling end-user query times to be measured in seconds.
Key to this approach is the careful selection of which data can bring the most value to the end users the quickest. The high-value data is immediately processed directly through the high-performance data mart solution, while data with a delayed value equation is staged for later loading into a warehouse. Using this combination of mart and warehouse, the time value of all the available data is maximized.
About the Author:
Matthew Doering is the Vice President of Product Management for QueryObject Systems Corporation (Uniondale, N.Y.). He can be reached at [email protected].