Cleaning the Enterprise Attic: Software Relics Cost Money

Mainframe software asset management has been ignored by many research groups and IT managers.

Because today’s business enterprises address their computing problems with cross-platform solutions from a variety of suppliers, it has never been more difficult to monitor and control IT assets. As variables multiply, systems grow more complex, and the pace of technological change quickens, the problem of out-of-control assets deepens. Enterprise-wide asset management — monitoring inventory and usage and administering vendor relationships — is the key to regaining control and containing costs, whether an application resides on the mainframe, a UNIX or NT server, or a desktop PC.

Although far-flung hardware — the fruit of distributed computing — gets a lot of attention these days, software — especially mainframe software — represents an even bigger problem. META Group reports that while the hardware portion of IT budgets has gone from 39 to 13 percent, software has climbed from 12 to more than 35 percent. Fully 70 percent of all corporate data lives on the mainframe — processed by some 300 billion lines of application code in the United States alone. Its estimated value is $6 trillion — enough to reconstruct, building by building, 15 of the world’s largest cities.

This is infrastructure that can’t be ignored. Most of it consists of homegrown applications that would be too costly to move to other platforms. Much mainframe software is also the backbone of a company’s competitive advantage — and all of it is being analyzed for Y2K compliance. Moreover, the mainframe’s inherent reliability, security and speed position it for a central role in the burgeoning areas of electronic commerce and ERP.

The problem facing IT managers is that the mainframe’s valuable assets have a tendency over time to become disorganized. What software resides there? Where is it located? Who is using it and how often? Is it up-to-date? These are all key questions for managers who must ensure that resources remain aligned with evolving organizational goals.

An IT organization loses sight of inventory in various ways. Personnel changes, data center relocations, mergers, downsizing — any number of commonplace business events — can muddy a corporation’s resource profile. Ever more powerful processors have driven installation of new or upgraded software and the development of increasingly sophisticated in-house applications.

But as markets and technology change, applications become obsolete, redundant or simply forgotten. Executable code may become disassociated from source code. An enormous stockpile of unused and unwanted software develops, much of which the organization continues to maintain and pay for. Without the right auditing software, sorting through it all and deciding what deserves to be kept can be a burden. Often, it seems easier to maintain all the application code rather than figure out what’s no longer necessary.

Year 2000 preparation challenges every data processing organization, and what to convert is a key question. Properly sizing and defining the problem is critical at a time when the SEC requires public companies to publish details of their Y2K plans in annual reports. At Citicorp, for example, it has been estimated that the conversion will cost $300 million.

Without an accurate picture of the company’s assets and their current role in decision support, factors such as disaster recovery planning, configuration changes and CPU consolidation suffer. How, for instance, does one determine the effect of system modifications on the rest of the organization?

Without usage information for their licensed software, IT managers are unprepared to negotiate favorable terms — and license compliance becomes impossible. Confusion about what software is licensed for which system and what versions are being used can expose an organization to lawsuits. With profit margins squeezed, software vendors are enforcing licenses more strictly than ever before.

Best practices dictate that major DP outsourcing firms regularly audit client inventory. Al Lessig, Manager of Computer Sciences Corporation’s software assets in North America, points out that the company "regards software asset management as a critical program in our overall business strategy. A regular program of inventory and usage reporting keeps our customers' financials in sync with their physical inventory."

Managers need to understand where they’re spending their precious budget dollars and how to stretch them. It’s vital that they control their software assets today to plan for tomorrow’s growth. Hundreds of thousands of dollars are wasted each month on license fees for unneeded software. With all the business challenges that every organization faces, that money can be put to better use.


Robert Barritz founded Isogon in 1983 and serves as its President and CEO. He previously worked for IBM’s System Development Division, where he helped develop the early releases of the operating system now known as OS/390.
