SQL Server 7.0 Demonstrations Top 1 Terabyte

LAS VEGAS -- As Microsoft Corp. rolled out SQL Server 7.0 at Comdex Fall in November, two system integrators stood ready with 1- and 2-terabyte demonstration databases built on the new product.

Unisys Corp. (Blue Bell, Pa., www.unisys.com) unveiled its Customer Behavior and Profitability Application, an online demonstration of a new banking application with 2 terabytes of raw data and indices that require 4 terabytes of storage. Data General Corp. (Westboro, Mass., www.dg.com) demonstrated a test version of its TeraCLIN medical data warehouse with 1 terabyte of raw data and images and 2.2 terabytes of storage. Both databases were designed exclusively for SQL Server 7.0. In addition to the applications, both companies announced system integration services specializing in SQL Server 7.0.

Microsoft unveiled the first SQL Server 7.0 database with more than a terabyte of data in June with its online store of satellite images, TerraServer (www.teraserver.com). Unisys and Data General are the first companies outside of Microsoft to build SQL Server 7.0 databases of a terabyte or more. While not production uses, the demonstrations represent a significant jump by non-Microsoft companies over the 100-GB databases that represent the largest production uses of SQL Server 6.5.

"The application and 2-terabyte database is proof of SQL Server 7.0 and its enterprise-class attributes," Paul Rachal, Unisys vice president and general manager for Enterprise NT Services, says of the Unisys demonstration of its banking application.

Brian Murphy, an analyst with the Giga Information Group (Cambridge, Mass.), sees the large demonstrations from Unisys and Data General as only a small step toward acceptance in the enterprise for SQL Server 7.0.

"It’s unusual for this kind of cast of characters to attract the attention of brand new customers. People who are going to be the leaders in terms of OEM [original equipment manufacturers] are Compaq, HP, IBM ... companies that have basically more mindshare in large enterprises," Murphy says. "The harsh reality is that the overwhelming majority of enterprise transactional systems are running on mainframes and Unix machines. It’s going to be years and years and years before lots of mission-critical computing is done on an NT platform."

Both the Unisys and Data General products are packaged applications built for a specific vertical market. The Unisys application, which can be accessed at www.sql7.unisys.com, helps a bank group its customers, then study which types of accounts tend to make or lose money for the bank within each group. Consequently, a banker can avoid trying to sell a customer a type of account the customer is likely to use in a way that would be unprofitable for the bank.
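The segmentation logic described above can be sketched in a few lines of Python. This is purely illustrative; the segment names, account types, and profit figures below are invented stand-ins, not data from the Unisys application.

```python
from collections import defaultdict

# Hypothetical records: (customer segment, account type, monthly profit).
# All values here are made up for illustration.
records = [
    ("young-professional", "checking", -2.50),
    ("young-professional", "checking", -1.75),
    ("young-professional", "savings", 4.00),
    ("retiree", "checking", 3.25),
    ("retiree", "cd", 6.10),
    ("retiree", "cd", 5.90),
]

def profitability_by_segment(rows):
    """Average profit per (segment, account type) pair."""
    totals = defaultdict(lambda: [0.0, 0])
    for segment, account, profit in rows:
        bucket = totals[(segment, account)]
        bucket[0] += profit
        bucket[1] += 1
    return {key: total / count for key, (total, count) in totals.items()}

profile = profitability_by_segment(records)

# A banker could then avoid pitching account types whose average
# profitability is negative for a given customer segment.
unprofitable = {key for key, avg in profile.items() if avg < 0}
```

At terabyte scale the warehouse would of course compute these aggregates inside the database engine rather than in application code; the sketch only shows the shape of the analysis.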

The demonstration database holds records for 4 million households, 10 million accounts and 6 billion individual transactions over a 24-month period. The data warehouse is stored on four Unisys QS/2 enterprise servers running Windows NT 4.0 Enterprise Edition. Each processor in each four-way server is 400 MHz.

Rachal says the amount of information stored in the system is typical of the types of customers Unisys hopes will purchase the system: "Our target is banks with 1 million or more customers."

Raj Tewari, director of Unisys’ Database Solutions unit, expects the types of warehouses the company helps customers build with SQL Server 7.0 to be even bigger. "I anticipate databases on the NT platform going much beyond 2 terabytes in the very near future," Tewari says.

Data General’s TeraCLIN is more of a "proof point," according to Michael O’Neill, director of Data General’s Microsoft marketing. Real solutions that the company expects to sell to customers would contain much less data. "Our target market is pretty much the mid-market space," he says.

The Data General system is designed to answer several key questions: What patient attributes are common across specific clinical diagnoses, what medications are being prescribed for specific diagnoses, and what is the cost/profit breakdown of the services and medication provided at each facility?
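The second and third questions above are aggregation queries, which can be sketched in Python. The record fields, diagnoses, and dollar amounts below are hypothetical stand-ins for the kinds of columns TeraCLIN is described as holding, not the product's actual schema.

```python
from collections import Counter, defaultdict

# Hypothetical billing records; every field and value is invented
# for illustration.
records = [
    {"diagnosis": "hypertension", "medication": "lisinopril", "cost": 12.0, "billed": 30.0},
    {"diagnosis": "hypertension", "medication": "amlodipine", "cost": 15.0, "billed": 28.0},
    {"diagnosis": "hypertension", "medication": "lisinopril", "cost": 12.0, "billed": 30.0},
    {"diagnosis": "diabetes", "medication": "metformin", "cost": 8.0, "billed": 25.0},
]

def medications_for(diagnosis, rows):
    """Medications prescribed for a given diagnosis, by frequency."""
    return Counter(r["medication"] for r in rows if r["diagnosis"] == diagnosis)

def margin_by_diagnosis(rows):
    """Total billed amount minus cost, per diagnosis."""
    margins = defaultdict(float)
    for r in rows:
        margins[r["diagnosis"]] += r["billed"] - r["cost"]
    return dict(margins)

top_meds = medications_for("hypertension", records)
margins = margin_by_diagnosis(records)
```

In the real warehouse these would be SQL GROUP BY queries run by the database engine; the Python version just makes the grouping explicit.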

Using artificial data, the test system held two years' worth of records on 6 million patients, 3,000 facilities, 5,000 doctors, 300 insurance companies and 710 billable services. It consisted of sixteen 200-MHz processors on a pair of eight-processor SMP nodes in Data General's AViiON "Cluster-in-a-Box" configuration with the company's CLARiiON fibre channel disk storage.
