The Case for Legacy Technology: Beyond New vs. Old

Pairing legacy systems with server-based systems that have complementary strengths can extend the life of legacy systems

by Greg Enriquez

The information technology arena thrives on the notion that newer is better. Many aspects of modern IT are certainly far superior to technology used to build systems 10, 20, or 30 years ago. Yet many of those old "legacy" systems remain in use today, and often have years of productive life ahead of them. In this case, legacy refers to any application source code base that was written before 1993, and is thus more likely to be written in COBOL or Basic than C or Java.

When the march toward the future becomes reflexive rather than considered, companies may make decisions about their legacy technology based solely on newness. That often takes them from a position of relative IT strength into a period of needless cost and disruption, brought on by underestimating their legacy systems' long-term viability.

The assumption that a newer system must be better just because it's new is an obvious canard to anyone with enterprise systems experience. Legacy enterprise systems almost always perform their intended task better than server-based systems. They are more secure than server platforms because of their built-in redundancy and relative immunity to viruses. The legitimate and most often-cited reasons for replacing enterprise systems are their inflexibility and maintenance costs. Balanced against legacy systems' benefits, however, those points are not sufficient to justify implementing new systems as soon as they're available.

Deciding whether to retain or replace legacy systems is more than a "new versus old" choice. In many cases where an IT organization decides to replace its legacy systems, the more profitable strategy is to implement a proven architecture that draws on the respective strengths of legacy and server systems to deliver more value than either could alone.

Surround is Sound

Enterprise systems deliver the most value in architectures where they perform essential processing and data security functions. Server-based applications surround the legacy core, supporting the user interface, communication, and querying functions. This model has worked successfully for 15 years in the telecommunications and financial services industries and adapts easily to other industries.

In Internet banking, for example, applications for making payments and checking balances are on server platforms that provide flexibility for adding and modifying services. The server platforms regularly extract data from the legacy system and use it to conduct transactions with other institutions. However, the legacy system is the final arbiter. It reconciles debits and credits through its overnight batch process and establishes the new, authoritative account balance. Similarly, the customer can use a Web-based system to change their personal information, but the Web-based system feeds that information to the legacy system to make the change official.
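The arbiter pattern described above can be sketched in a few lines. This is a minimal illustration, not any bank's actual system: the account structure, function names, and amounts are all invented. The point is the division of labor -- the server tier accepts transactions all day, but only the legacy tier's overnight batch publishes an authoritative balance.

```python
from dataclasses import dataclass, field

@dataclass
class Account:
    # The authoritative balance belongs to the legacy tier; the server
    # tier only accumulates provisional transactions during the day.
    legacy_balance: int = 0          # cents, set by the overnight batch
    pending: list = field(default_factory=list)

def front_end_deposit(account: Account, amount_cents: int) -> None:
    """Server tier: accept the request, but don't settle it."""
    account.pending.append(amount_cents)

def front_end_payment(account: Account, amount_cents: int) -> None:
    account.pending.append(-amount_cents)

def overnight_batch(account: Account) -> int:
    """Legacy tier: reconcile all debits and credits and publish the
    new authoritative balance. Until this runs, the front end's view
    is provisional."""
    account.legacy_balance += sum(account.pending)
    account.pending.clear()
    return account.legacy_balance

acct = Account(legacy_balance=10_000)   # $100.00 as of last night
front_end_deposit(acct, 2_500)          # $25.00 in
front_end_payment(acct, 1_000)          # $10.00 out
assert acct.legacy_balance == 10_000    # still yesterday's figure
assert overnight_batch(acct) == 11_500  # legacy system is the arbiter
```

The same shape covers the personal-information example: the Web tier records the requested change, and the legacy system's acceptance of it makes it official.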

Telecommunications billing is another example of legacy and server systems working effectively together. When competitive long-distance carriers Sprint and MCI marketed innovative pricing plans and bundled services to challenge AT&T in the late 1990s, the latter could not modify its billing system quickly enough to protect its market share. AT&T's legacy billing platform was designed to process huge volumes of transactions in overnight batches according to rates hard-coded into the applications. Re-coding to enter new rates, add another service, or merge charges onto a single bill was expensive and slow because legacy systems were built for power and security, not the flexibility a competitive marketplace demands. The solution was to ring the main billing system with server platforms that extracted the data from the hierarchical database and loaded it into convergent billing applications, based on relational database management systems (RDBMS), that supported services targeted at specific markets. The relational databases' superior flexibility enabled the service providers to easily alter their rates and bundle services by changing values in the data tables.
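The difference between hard-coded and table-driven rates is the crux of the billing example. The sketch below, with invented plan names and prices, shows why the convergent approach was faster to change: repricing a plan is a data update, not a re-coding project.

```python
# Rates live in data, not in application code, so launching a new
# price is a table change rather than a redeployment. Plan names and
# rates are hypothetical.
RATE_TABLE = {
    "standard": {"per_minute_cents": 10, "monthly_cents": 0},
    "bundle":   {"per_minute_cents": 5,  "monthly_cents": 1_500},
}

def rate_call(plan: str, minutes: int) -> int:
    """Compute a charge in cents from the current rate table."""
    r = RATE_TABLE[plan]
    return r["monthly_cents"] + r["per_minute_cents"] * minutes

assert rate_call("standard", 100) == 1_000

# Marketing launches a promotion: change a value, not the application.
RATE_TABLE["standard"]["per_minute_cents"] = 7
assert rate_call("standard", 100) == 700
```

In the hard-coded legacy billing system, the equivalent of that last change meant modifying, testing, and redeploying application code.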

The telecom example highlights legacy systems' key advantage. They are still the best platform for running hierarchical databases, which remain the best foundation for large-scale transaction processing. Utilities and exchanges, for example, are unlikely to implement transaction processing systems based on an RDBMS because the relational structure doesn't lend itself to massive transaction volumes. Data fields in relational databases can have a dozen attributes linking them to other data fields. That much variation creates opportunities for errors and makes processing slower than a legacy system's. The legacy systems' simpler data path -- with only one common attribute between data fields -- is better suited to processing large transaction volumes.

Those same qualities, however, make legacy systems a poor choice as a front end. The hardened code that supports fast processing doesn't change quickly enough to accommodate changes companies have to make in their interfaces to keep up with their markets. For example, a hierarchical database would be a poor choice to support an interface that includes search functions based on several variables. The table-based structure, with its multiple associations between data sets, is better for the ad-hoc data queries performed by analysis applications. With the relational databases providing querying capacity and the legacy system keeping the authoritative data, the two are an effective combination.
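The contrast between the two access patterns can be made concrete with a small sketch. The schema and data here are invented, and an in-memory SQLite table stands in for the query-side RDBMS; the point is that a multi-variable search is natural on relational tables, while the hierarchical path reaches a record through a single key.

```python
import sqlite3

# In-memory relational store standing in for the query-side RDBMS.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE accounts "
    "(id INTEGER, region TEXT, balance_cents INTEGER, opened_year INTEGER)"
)
db.executemany("INSERT INTO accounts VALUES (?, ?, ?, ?)", [
    (1, "east", 50_000, 1998),
    (2, "west", 120_000, 2001),
    (3, "east", 80_000, 2003),
])

# Ad-hoc, multi-variable search: routine on a relational platform,
# awkward on a hierarchical one.
rows = db.execute(
    "SELECT id FROM accounts WHERE region = ? AND balance_cents > ?",
    ("east", 60_000),
).fetchall()
assert [r[0] for r in rows] == [3]

# The hierarchical path, by contrast, resembles a single-key lookup:
# one common attribute between parent and child, nothing more.
by_id = {1: 50_000, 2: 120_000, 3: 80_000}
assert by_id[2] == 120_000
```

In the combined architecture, the relational side answers queries like the one above while the legacy side remains the system of record for the balances themselves.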

New Doesn't Always Equal Better

In the Internet banking and telecom examples, companies implemented hybrid legacy/server architectures because they had no alternative. Fifteen years later, however, the range and quality of server-based systems have grown exponentially, IT shops do have choices, and that brings us back to the original question -- why retain an enterprise system when newer alternatives are available?

As the banking and telco examples demonstrate, enterprise systems cast in the proper role can deliver significant value, perhaps indefinitely. The one exception is when the vendor discontinues maintenance and support; that is a sound reason to retire a legacy system. No system, legacy or contemporary, regardless of its performance, can run profitably over the long term if the company's IT organization has to maintain it without vendor support. The pool of technologists with legacy technology skills is shrinking, and although there's no evidence to suggest it's shrinking faster than the demand for such skills, it would be difficult for a company to maintain its legacy skill base without vendor education programs.

This issue aside, there is no sound reason to reflexively abandon legacy technology. At their best, enterprise systems are still among the most secure and robust elements in the data center. Through successive waves of client/server and Web-based computing paradigms, most stock exchanges and credit card companies have continued running their transaction processing systems on 20-year-old custom code on legacy hardware platforms. As long as they continue delivering value, the costs and risks of wholesale legacy system replacement make for a poor strategic decision. Pairing legacy systems with server-based systems that have complementary strengths extends the life of legacy systems -- not just for the sake of preserving them, but for preserving the value they continue to produce.


Greg Enriquez is senior vice president, worldwide sales and field operations at Stratus Computer, in Maynard, Mass. You can reach the author at