A DBMS Alternative for OLTP Challenges
One of the most difficult decisions any IS executive makes is one that leads to a major change in an organization's core information technology. The investment in time, money and resources that is tied to enterprise technology is so huge that it typically takes the equivalent of a paradigm shift to even begin contemplating replacement. This understandable resistance to change may, in fact, be the chief reason why decades-old legacy systems are so prevalent in many organizations.
When migration to replacement technology is actually viewed as a realistic possibility rather than as an "if I had it all to do over again" daydream, it's often because of external factors. No CIO takes this type of change lightly because of the potential impact on mission-critical applications.
For the IS organization at the Department of Justice State of Bern, Switzerland, the driving factor for change came, as is typically the case, from the user population. Two years after installation, it became painfully clear that growing transaction volumes were making it impossible for a mission-critical online transaction processing (OLTP) application to provide satisfactory performance and response for information users. The eventual result was a change of the enterprise database system from Sybase, Inc.'s SQL Server System 10 to the Caché post-relational database from InterSystems Corporation. The decision and evaluation processes behind this core technology transition should provide a roadmap for any IS manager who is confronting performance and scalability challenges in an OLTP environment.
The Department of Justice is responsible for the infrastructure and organization of the court system, including Judges, Circle Courts, Primary Courts, Registry of Commerce, Bankruptcy Service and the Landmark Registry in the State of Bern. The IS staff works in a 12th-century castle that serves as department headquarters. While the surroundings are quaintly picturesque, the information technology challenges are the same as those faced by technologists in many expanding enterprises. The application that raised the performance requirements and led to the database change is the Department of Justice Business Control System. Called Tribuna 2000, this OLTP application runs on a client/server network that serves more than 1,400 information users in 43 locations across the state of Bern. In effect, it regulates every aspect of the Department's business operation. Scheduling police appearances before the Swiss judiciary, assignment of court dates, management of court transcripts and case documentation are just a few of the key operations handled by Tribuna 2000.
After two years of operation, it was clear that the Sybase database around which this critical application revolved was not delivering the high level of performance and scalability needed by the system users. Transactions coming in each day could no longer be completely processed before the next day's transactions were received. The result was a backlog of court cases that began to grow at an alarming rate, a situation that could not be tolerated by Bern's judges or its citizenry. Also critical, the system was somewhat unstable on the Windows NT 4.0 platform.
Search for a Replacement DBMS
Before kicking off the complex process of evaluating new databases, options for improving performance without replacing the Sybase technology were examined closely. The problem was, to some degree, inherent in relational database design, which is optimized for decision support and query handling rather than for high-volume OLTP. To improve performance, in-house developers generally create stored procedures from frequently executed SQL code so that the next time the procedure is performed, the SQL script doesn't have to be rebuilt. The Department of Justice chose not to take that route because of cost, support and maintenance issues. Also, it became obvious that changes to the underlying Sybase technology to improve performance would require changes to the Tribuna 2000 application itself. Since this is a purchased application, the end result would be a major increase in service and maintenance costs, which was economically unacceptable.
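The payoff of precompiling frequently executed SQL can be illustrated in miniature. The sketch below uses Python's built-in sqlite3 module as a stand-in, since the article's Sybase environment and the Tribuna 2000 schema are not public; the table, data and query here are hypothetical. The idea is the same one stored procedures exploit: a hot query is written once as a parameterized statement, so repeated calls reuse the compiled statement instead of rebuilding and re-parsing the SQL text on every execution.

```python
import sqlite3

# In-memory stand-in for a case-tracking table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (id INTEGER PRIMARY KEY, court TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO cases (court, status) VALUES (?, ?)",
    [("Bern", "open"), ("Thun", "open"), ("Bern", "closed")],
)

# A frequently executed query, written once as a parameterized statement.
# sqlite3 keeps a cache of compiled statements (see the cached_statements
# connection argument), so repeated calls skip re-parsing the SQL text,
# which is the per-call cost a server-side stored procedure avoids.
OPEN_CASES = "SELECT COUNT(*) FROM cases WHERE court = ? AND status = 'open'"

def open_case_count(court):
    return conn.execute(OPEN_CASES, (court,)).fetchone()[0]

print(open_case_count("Bern"))  # 1
print(open_case_count("Thun"))  # 1
```

On a client/server DBMS the saving also includes shipping less SQL text over the network, which is part of why the department weighed this option at all.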
An upgrade to SQL Server System 11 addressed the stability issues on the Windows NT platform. However, the performance problem remained and was worsening as transaction volumes continued to grow.
In-depth evaluation of alternative database technologies began in June 1997. Selection criteria included:
- Ability to handle transaction volumes as high as a half-million per day with daily turnaround and acceptable end-user response
- Ability to support the Tribuna 2000 application without any changes to the application source code
- Scalability across a client/server network with 1,400-plus end users
- Stability on the Windows NT platform
Investigation of multiple database solutions, including relational technology giants Oracle and Informix, led to the conclusion that Caché was the optimum database technology for our high-performance OLTP needs.
From Evaluation to Real-World OLTP
The selection of a database began a migration process that extended over six months. Extensive testing revealed that Caché could provide a dramatic improvement in performance and successfully scale across the multi-tier, distributed client/server network without making any changes to the Tribuna 2000 application product. The new database also eliminated any requirement for in-house technologists to develop code because frequently used procedures are prestored automatically.
Based on test results, Tribuna 2000 went live on the new database platform in 1998. The system runs on servers from Digital Equipment Corp. (since acquired by Compaq Computer Corp.) and PC clients over an Ethernet backbone, with Caché residing on the server and ODBC on the client. The network includes approximately 600 PC workstations and 20 servers spread across local-area networks in 17 geographic locations.
With the production system rollout, the benefits of implementing database technology that is specifically optimized for OLTP became quickly apparent as court case backlogs began to disappear. Currently, the system is handling an estimated 190,000 cases annually across the 13 tribunals served by the Bern Department of Justice.
While there is considerable variation in the number of cases managed in each geographic location, the overall transaction load can reach a half-million each day. Based on department records, Caché is performing at rates 20 to 30 times faster than the relational database overall and up to 100 times faster on the transaction handling side.
Also critical, the database changeover was made without any need to change a single line of application code. As a result, the IS organization avoided the future maintenance and service charges that are the inevitable result of source code changes to purchased applications.
Performance improvements like these are only possible with technology optimized specifically for OLTP environments. For example, as noted in a report from International Data Corp. (IDC), technology market research specialists in Framingham, Mass., Caché includes multiple concurrency models, online backup capability, server shadowing, replication, and on-the-fly database partitioning. "This dynamic partitioning allows administrators to cleave the database on the fly, moving portions to a new server, all the while continuing the complete operation of the database with no loss of data and no stoppage of transaction processing," the report states. IDC also observes that "in Caché, both clients and servers retain a local cache of data and methods. High performance is achieved because needed data is already cached locally 85 percent to 90 percent of the time."
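The local-cache behavior IDC describes can be pictured with a generic read-through cache. The sketch below is illustrative only, not Caché's actual caching protocol: a lookup is served from a local copy when possible and falls back to a simulated server fetch otherwise, and the hit rate shows why serving 85 to 90 percent of reads locally eliminates most network round trips.

```python
class ReadThroughCache:
    """Minimal client-side read-through cache (illustrative sketch,
    not InterSystems' actual protocol)."""

    def __init__(self, fetch_from_server):
        self._fetch = fetch_from_server  # fallback to the "server"
        self._local = {}
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self._local:
            self.hits += 1     # served locally, no network round trip
        else:
            self.misses += 1   # one round trip, then cached for next time
            self._local[key] = self._fetch(key)
        return self._local[key]

    def hit_rate(self):
        total = self.hits + self.misses
        return self.hits / total if total else 0.0

# Simulated server-side lookup (hypothetical data).
server = {"case:1": "open", "case:2": "closed"}
cache = ReadThroughCache(server.__getitem__)

for _ in range(9):
    cache.get("case:1")   # 1 miss, then 8 local hits
cache.get("case:2")       # 1 miss

print(f"{cache.hit_rate():.0%}")  # 80%
```

A real distributed cache must also handle invalidation when another client writes the same data, which is the hard part this sketch deliberately omits.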
There is no doubt that faster performance was the critical success factor for this OLTP implementation. However, the switch in database technologies has delivered some additional benefits. The investment in the new database is about 1 million Swiss francs (just over $670,000), which is comparable to the investment in the original Sybase DBMS. But the total cost of ownership is projected to be significantly less, and the new DBMS is proving much easier to maintain and requires far less system administration.
The success of this DBMS changeover validates a concept that is gaining favor with IS executives: database technology that is optimized for areas such as decision support, data warehousing and OLAP may not be appropriate for today's demanding OLTP requirements.
And those requirements are likely to grow. The Web is driving OLTP to ever-higher demands for performance and scalability, with no end in sight. Even if one ignores the Internet (something that no IS executive can afford to do), the trend toward mergers and acquisitions leading to the formation of mega-organizations across multiple industries means that OLTP performance and scalability demands are on an upward path. So, if and when the time comes to re-examine the enterprise OLTP architecture, it can only be to a manager's advantage to examine the possibilities of new database alternatives.
The centuries-old headquarters where IS technologists for the Department of Justice in Bern, Switzerland, are implementing high-performance OLTP systems.
About the Author:
Rolf Streb is the IT Manager for the Department of Justice State of Bern, Switzerland.
A Database Checklist for OLTP
Based on lessons learned and results achieved during the DBMS switch at the Department of Justice State of Bern, Switzerland, following are some key questions to ask once the vendor has given you the right answers about issues such as performance and scalability, the factors that are the typical drivers behind a core technology change involving OLTP:
How do you address Web transaction processing (TP)? Web connectivity approaches will have a major impact on TP performance. Look for a database solution that supports direct and state-aware connectivity to Web servers; the latter is key to high-performance Web TP.
What's your approach to object technology? Today, only a small fraction of applications are built with object technology. However, the Web is an inherently object-oriented environment, and the percentages are in favor of object technology being a factor in future applications. That's why virtually all relational technology vendors are offering some flavor of object-relational product, with varying levels of success. Even if your current focus is on SQL-based application performance, become knowledgeable about the various approaches to object processing and the trade-offs that come when object layers are added to traditional database engines. Also be sure to inquire about the level of object functionality being provided and whether the underlying database technology has a proven history in enterprise-class environments. The history of object-based systems is full of examples of applications that worked very well in pilot mode and then failed to scale up for enterprise architectures.
How is distributed client/server processing supported? Traditional relational databases, and many object-relational and object database technologies, do not effectively support distributed servers working on the same database, client-side caching, or dynamic database partitioning. These features, along with automatic failover and catch-up by a new server, are critical for success in a distributed network.
What is the mechanism for handling complex data? The two-dimensional tables characteristic of relational technology cannot effectively support the complex data that is typical of many large-scale TP applications in a variety of industries, including healthcare, insurance, financial services, manufacturing and shipping. Complex data management requires a multi-dimensional data model designed specifically for the large number of concurrent read-write transactions typical of OLTP applications.
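The contrast between flat tables and a multidimensional model can be sketched loosely. In Caché's lineage, data lives in sparse arrays with arbitrary-depth subscripts rather than in joined two-dimensional tables; the Python sketch below mimics that shape with a dict keyed by subscript tuples. It is a rough analogy only, and the case identifiers, field names and values are hypothetical.

```python
# A loose analogy for a multidimensional data model: one sparse array
# keyed by an arbitrary-depth tuple of subscripts, rather than flat
# two-dimensional tables joined at query time. (Illustrative only;
# all names and values here are hypothetical.)
case = {}

# Hierarchical case data stored under nested subscripts.
case[("1998-101", "court")] = "Bern"
case[("1998-101", "hearing", 1, "date")] = "1998-03-02"
case[("1998-101", "hearing", 1, "judge")] = "Keller"
case[("1998-101", "hearing", 2, "date")] = "1998-04-17"

def hearings(case_id):
    """Return the hearing numbers recorded for a case, in order."""
    return sorted({k[2] for k in case if k[:2] == (case_id, "hearing")})

print(hearings("1998-101"))  # [1, 2]
```

In a relational schema the same record would span a cases table and a hearings table linked by foreign keys, with a join on every read; here the hierarchy is stored directly, which is the structural advantage the paragraph above is pointing at.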
What languages are supported? In addition to industry-standard SQL, support for object-oriented languages including Java and C++ may be needed for future application development. Similarly, support of standard GUI components, such as Visual Basic or ActiveX controls, C++ classes and objects, or Java classes and JavaBeans, is desirable.
Does the technology have a proven OLTP track record? It's a given that customer references will be investigated before making a decision about core database technology. Be sure to request references to sites where the OLTP demands are at an enterprise level, where the concurrent users number in the thousands and the database accesses per transaction are at a level similar to those projected for the most complex applications.
Once the database choice has been made, the following pointers may prove helpful in implementing a successful technology changeover:
If dealing with a purchased application, establish and encourage clear lines of communication between both sets of vendor technologists: those providing the application and those implementing the DBMS. This is critical to ensuring all links operate as intended.
Draw on the services of the database vendor to help plan a pilot that rigorously tests the application and to assist in the data transfer when the time comes to roll out in production mode. With multiple migration experiences to draw on, they can provide much-needed expertise for an operation that occurs only rarely in most organizations.
Check and re-check the backup plan. Then have the vendor review it closely. This is another area where experience can help avoid costly errors, and the technology provider is in the best position to leverage database features designed to eliminate problems before they occur.