In-Depth

Next-Generation Business Intelligence

The hallmark of successful companies in the coming years will be their ability to let external users perform dynamic trending, drill-down analyses and what-if scenarios on both historical and real-time information.

Companies that maintain extranets today will need to take the next step and provide "intelligent extranets" in 2000 and beyond to stay competitive and maintain customer loyalty.

These next-generation "intelligent extranets" will provide more than point-in-time information. They’ll offer trending, forecasting, drill-down, ranking and other analytic applications that give customers, partners and suppliers a 360-degree view of their relationship with the company and their own performance.

Intelligent extranets are particularly critical for companies that sell commodity products or services, such as computers, 401(k) plans, chemicals and so on. These companies are quickly realizing that their only real competitive differentiator is the value of the information they offer along with their products and the flexibility and ease with which users can access that information.

Intelligent extranets are also critical to ensure the liquidity of emerging B2B exchanges and auctions. Participants in these marketplaces need dynamic access to up-to-date, detailed information stored in various databases within their supply chains.

Without proper information, these folks can’t make intelligent bidding decisions and will refrain from using the exchanges or auctions as a primary vehicle for conducting business.

Requirements

Creating an intelligent extranet or intranet should be easy, right? Just attach your data warehouse to the extranet or intranet and you’re in business. However, besides the obvious security implications, the challenge of providing access to hundreds, if not thousands, of internal and external users over the Web is immense.

Web users vary significantly in their familiarity with online analytical tools and understanding of analytical techniques and processes. Having been spoiled by the immediacy and interactivity of the Web, most won’t wait more than 10 seconds for an answer or spend more than a minute learning to use a new function. They don’t care about the location, format, cleanliness or atomic level of the data they’re querying, and they expect information to be presented to them in terms they understand.

They also want to view both historical and up-to-the-second data from multiple sources -- text, numeric, image, audio and so on -- in a single document composed of easily understandable charts, tables and other graphical items. They want to drill down to atomic-level data to generate mailing lists, cross-selling recommendations and purchase orders to close the loop between analysis and action.

New Tools

Tools with a "fat-client" architecture -- which access only summarized, numeric data batch-loaded into a single database supporting a specific schema and dimensional model -- probably won’t support your emerging Web-based information requirements.

What’s needed is a new type of analytical tool that meets the information requirements of Web users. Let’s call this new type of tool an Internet Intelligence (II) platform.

An II platform consists of one or more Internet-based servers that deliver a variety of analytical services to large numbers of geographically distributed users who want quick, intuitive and filtered access to large volumes of detailed data stored in many different types of systems and locations.

The II platform is more than a tool because it consists of a set of extensible services that can be easily accessed by other tools and applications using standard APIs. These services make it easy for software developers or information service providers to embed intelligence into their core applications. In essence, II platforms help turn "intelligence" into a ubiquitous commodity.

Some of the leading business intelligence vendors, such as Business Objects, Cognos and MicroStrategy, are renovating their existing product lines to meet II platform requirements, while start-ups, such as nQuire, Metagon, Quansoo Group and others, are stepping up to fill the gap.

II Highlights

Consider the following 12 requirements for II platforms:

1. Open. The ideal II product is a platform, not a tool. All functionality is available through a standard API (de facto or otherwise), such as COM, SQL or CORBA. This enables any application to embed "intelligence" into its core functionality or any analytical tool to interface with the II platform.

2. Server-Based. Next-generation II platforms are open, component-based server engines rather than client/server tools driven by end-users, eliminating the need to download and maintain plug-ins, applets, ActiveX objects or cab files on the client.

3. Transparent. II platforms support a metadata repository and views that shield users from back-end data complexity and volatility, letting them browse information, using terms and rules with which they’re familiar.
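
To make this concrete, here is a minimal sketch (plain Python, with hypothetical table and column names) of how a metadata "view" might map familiar business terms to physical columns, so users never see back-end naming or location details:

```python
# Hypothetical business-term-to-physical-column mapping; the names
# below are illustrative, not from any real system.
BUSINESS_VIEW = {
    "customer":   "crm.dim_cust.cust_nm",
    "revenue":    "dw.fact_sales.net_rev_amt",
    "order date": "dw.fact_sales.ord_dt",
}

def translate(term: str) -> str:
    """Resolve a familiar business term to its physical column."""
    try:
        return BUSINESS_VIEW[term.lower()]
    except KeyError:
        raise KeyError(f"unknown business term: {term!r}")

print(translate("Revenue"))  # dw.fact_sales.net_rev_amt
```

When the warehouse schema changes, only the mapping is updated; the terms users query with stay the same.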

4. Distributed. II platforms contain rules to query and integrate heterogeneous data stored in distributed systems within and across corporate boundaries. The II platform understands differences in formatting, atomicity, dimensionality, schemas and historicity of source data stored in transactional or decision support systems.
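
A toy sketch of that idea, using two invented sources: one system stores amounts in cents with compact dates, the other uses dollars and ISO dates, and the platform's rules normalize both into one result set at query time:

```python
# Hypothetical source data; formats deliberately differ.
orders_erp = [  # ERP stores amounts in cents, dates as YYYYMMDD
    {"id": 1, "amt_cents": 1250, "date": "20000114"},
]
orders_web = [  # web store uses dollars and ISO dates
    {"id": 2, "amount": 8.75, "date": "2000-01-15"},
]

def normalize_erp(row):
    """Convert an ERP row into the common (dollars, ISO-date) shape."""
    d = row["date"]
    return {"id": row["id"],
            "amount": row["amt_cents"] / 100,
            "date": f"{d[:4]}-{d[4:6]}-{d[6:]}"}

def federated_orders():
    """Merge both sources into one uniform, date-ordered result set."""
    rows = [normalize_erp(r) for r in orders_erp] + orders_web
    return sorted(rows, key=lambda r: r["date"])

for row in federated_orders():
    print(row)
```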

5. Timely. Because of their distributed capabilities and rich metadata, II platforms can provide users and e-business applications with timely data.

6. Intuitive. II platforms are as easy to use as a Web search engine. In fact, the query paradigm should be more of a search paradigm, with keyword or natural language "searches" as the norm and a customizable user interface.

7. Self-Adapting. II front-ends dynamically adapt to the user’s skill level. The tool exposes only as much functionality as the user is ready to use, and no more.

8. Handles Complex Queries. II platforms support rankings, percentiles, moving averages, ratios, contributions and other types of dimensional calculations.
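
A minimal sketch, in plain Python with made-up sales figures, of three of the calculations named above -- ranking, contribution to total and a moving average:

```python
# Hypothetical regional sales figures.
sales = {"East": 120.0, "West": 95.0, "North": 60.0, "South": 145.0}

# Rank regions by sales, highest first.
ranked = sorted(sales, key=sales.get, reverse=True)

# Each region's contribution to the total, as a percentage.
total = sum(sales.values())
contribution = {r: round(100 * v / total, 1) for r, v in sales.items()}

# Three-period moving average over a monthly series.
monthly = [100, 110, 95, 120, 130]
moving_avg = [sum(monthly[i:i + 3]) / 3 for i in range(len(monthly) - 2)]

print(ranked)  # ['South', 'East', 'West', 'North']
print(contribution)
print(moving_avg)
```

A real II platform would push such calculations to the server and apply them across arbitrary dimensions, but the arithmetic is the same.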

9. Scalable. II platforms run on high-performance SMP machines and support clustering for load balancing and failover, as well as connection pooling to increase query throughput against large volumes of detailed data.
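
The connection-pooling piece can be sketched in a few lines of Python (a simplified illustration with a stand-in connection factory, not any vendor's implementation): a fixed set of reusable connections is handed out to concurrent queries, capping load on the back-end database:

```python
import queue

class ConnectionPool:
    """Toy fixed-size pool: hand out, then recycle, connections."""

    def __init__(self, factory, size=4):
        self._pool = queue.Queue()
        for _ in range(size):
            self._pool.put(factory())

    def acquire(self, timeout=5.0):
        # Blocks until a connection is free (or the timeout expires).
        return self._pool.get(timeout=timeout)

    def release(self, conn):
        self._pool.put(conn)

# Stand-in for a real database connection factory.
make_conn = lambda: object()

pool = ConnectionPool(make_conn, size=2)
c1 = pool.acquire()
c2 = pool.acquire()
pool.release(c1)      # returned connections are reused...
c3 = pool.acquire()
assert c3 is c1       # ...not re-opened
```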

10. High Performance. The II platform warns users when queries will take a long time to run, or reschedules complex queries to run after hours.

11. Secure. The II platform contains a security mechanism that manages data and application functionality at an object level.
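
Object-level security means permissions are checked per user and per object -- a report, a column, a function -- before anything executes. A hypothetical sketch (invented user and object names):

```python
# Access-control entries: (user, object) -> permitted actions.
ACL = {
    ("supplier_42", "report:inventory"): {"view"},
    ("analyst_7",   "report:inventory"): {"view", "drill_down"},
}

def authorized(user: str, obj: str, action: str) -> bool:
    """Grant an action only if the ACL explicitly allows it."""
    return action in ACL.get((user, obj), set())

assert authorized("analyst_7", "report:inventory", "drill_down")
assert not authorized("supplier_42", "report:inventory", "drill_down")
```

An external supplier and an internal analyst can thus share the same extranet while seeing different slices of data and functionality.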

12. Manageable. The II platform comes with a consolidated administration console for managing performance, metadata, design and changes.

About the Author: Wayne Eckerson is Director of Research at TDWI. He can be reached via e-mail at weckerson@dw-institute.com.
