Feature: Y2K Threat Forged a New IT
After spending billions of dollars and putting in long nights and weekends, what does the IT industry have to show for all of its Y2K efforts?
It depends. For some IT managers, it was a job with an impossible deadline that was squeezed in between other projects. For others, it opened a new dimension to IT management.
"Y2K was a catalyst or opportunity for other IT improvements," says Leon Kappelman, associate director of the Center for Quality and Productivity at the University of North Texas (www.unt.edu). "Y2K was a watershed. Everyone knows IT counts now, and that little things mean a lot when lots of little things are involved. They also know how to more effectively communicate about IT, from both sides of the monitor. Equally important, new standards have been set for cooperation and communication, and new mechanisms established to facilitate it."
In a way, Y2K served as the industry's first "information technology stand-down," says Bruce Anich, senior management consultant at Robbins-Gioia Inc. (www.robbinsgioia.com), an IT consulting firm. "For two years, everybody focused on remediating their legacy systems, and put off bringing new information technology online. From a macro view, the stand-down brought corporations up to a level playing field."
A Management Problem
Y2K prompted one of the world's largest organizations, the U.S. government, to institute interagency communication on a massive scale, not to mention getting its systems house in order.
"Governments have complete inventories of resources and systems for the first time," says John Koskinen, chairman of the President's Council on Year 2000 Conversions. "Major online testing environments for critical new systems have been built in many governments. These, and new project management techniques developed for the complexity of Y2K, will pay major dividends when new systems are undertaken in the future."
One branch of the government, the U.S. Navy, reports a heightened appreciation for IT priorities as a result of its concerted Y2K effort. "We had to view this as much more of a management problem than an IT problem," says Dave Wennergran, deputy chief information officer for the Navy. "Y2K involved organizations from throughout the Navy department, from weapons systems to mainframes running COBOL code." Wennergran estimates that his task force looked at about 2,000 systems.
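The defect at the heart of all that COBOL-era remediation is easy to illustrate. The following is a minimal sketch in Python -- illustrative only, not Navy code -- of the classic two-digit-year bug and the "windowing" fix that many remediation teams applied:

    # Two-digit years lose their century, so date arithmetic breaks
    # at the 1999 -> 2000 rollover.
    def years_elapsed_buggy(start_yy, end_yy):
        # Pre-remediation logic: assumes both years share a century.
        return end_yy - start_yy          # 00 - 99 gives -99, not 1

    def years_elapsed_windowed(start_yy, end_yy, pivot=50):
        # Windowing fix: two-digit years below the pivot are read as
        # 20xx, the rest as 19xx. A pivot around 50 was a common choice.
        def expand(yy):
            return 2000 + yy if yy < pivot else 1900 + yy
        return expand(end_yy) - expand(start_yy)

    print(years_elapsed_buggy(99, 0))     # -99: the Y2K failure mode
    print(years_elapsed_windowed(99, 0))  # 1: correct across the rollover

Windowing was the cheap fix; expanding stored dates to four digits was the durable one, and much of the Y2K expense lay in deciding, system by system, which approach to take.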
Early in the process, there was consternation that Y2K was an IT problem that only involved IT people. "Actually, it affected every aspect of the way we do business," Wennergran says. "We recognized that information technology, information management, and knowledge superiority, are integral components of every aspect of the way the Navy and Marine Corps operate."
The Y2K awareness process included televised forums featuring some of the Navy's top brass, an idea that may be applied to future IT challenges, such as information security.
"We needed not only to make sure that we were functioning seamlessly into the New Year, but also to develop a robust set of lessons learned from Y2K," Wennergran says. "How does this affect the way we do information assurance, how does this affect the way we manage our information systems, how does this affect just the way we go about dealing with complex management problems? Through a robust testing effort, we learned a lot about ourselves."
A More Disciplined Approach
Y2K was certainly a top priority for high-technology businesses. Hewlett-Packard Co. (HP, www.hp.com), for example, had a dual challenge: address its own internal business systems and double-check its product lines.
Fortunately, from a technical perspective, the product-side remediation was quick and easy, says Brad Whitworth, Y2K communications manager at HP. "It was a pretty simple process. More than half of our products -- such as printers -- don't even process dates."
The management side of the project was a bit more interesting, though. "Y2K became the largest single outreach to customers that we have ever done," Whitworth explains. "Every part of HP's businesses needed to reach out to an existing base of customers to make sure they were checking their systems."
The most valuable lesson HP gained from its Y2K experience is that an inventory of software assets is invaluable. "Companies historically have done a decent job of looking at hardware, and thinking about things like how many PCs, how many servers, and upgrades," Whitworth says. "Y2K forced a lot of people for the first time to create an inventory of their software assets."
For many organizations, the knowledge base coming out of their Y2K efforts will be valuable for future IT endeavors. "We had to keep track of all this work that we were doing," the Navy’s Wennergran says. "For the first time, we had a database of all the systems that we care about, and all of the infrastructure devices that we have. We began to understand what we had, what worked together, and what interfaces these systems had."
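The inventory Wennergran describes boils down to a simple record of systems, owners, and interfaces. A hypothetical sketch in Python (the schema and names are illustrative, not the Navy's actual database) shows the kind of cross-system question such an inventory makes answerable:

    from dataclasses import dataclass, field

    @dataclass
    class SystemRecord:
        # One inventory entry: what the system is, who owns it,
        # and which other systems it interfaces with.
        name: str
        owner: str
        platform: str          # e.g. "mainframe/COBOL", "Windows NT"
        interfaces: list = field(default_factory=list)
        remediated: bool = False

    inventory = {}

    def register(rec):
        inventory[rec.name] = rec

    def unremediated_dependencies(name):
        # Systems this one talks to that have not yet been fixed --
        # the cross-system view an inventory makes possible.
        rec = inventory[name]
        return [i for i in rec.interfaces
                if i in inventory and not inventory[i].remediated]

    register(SystemRecord("payroll", "HQ", "mainframe/COBOL",
                          interfaces=["personnel"], remediated=True))
    register(SystemRecord("personnel", "HQ", "Windows NT"))
    print(unremediated_dependencies("payroll"))   # ['personnel']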
Regular inventories of software enabled HP to "get better use from the software itself," Whitworth says. "Regular software tuning and upgrades make a lot of sense. If it hadn't been for Y2K, we probably wouldn't have done any revisions. Now we know we need to do it periodically." For example, Whitworth reports that his team came across a program that took two-and-a-half hours to run. "After going back and reworking it, it now runs in 11 minutes."
Many companies adopted a more disciplined approach to software auditing, as recent research from Vector Networks Ltd. (www.vector-networks.com) confirms. The survey shows that more than one-third of IT managers believe Y2K had a dramatic effect on their auditing policy. In fact, more than 75 percent now have a formal auditing policy.
"The concerns over the millennium bug certainly sharpened IT managers' attention to the importance of auditing," says Colin Bartram, product marketing director at Vector Networks. "Issues will always arise that require IT managers to be able to audit their PC resources quickly and efficiently. For example, a new virus will mean ensuring that everyone has the latest anti-virus software installed immediately, and proactive measures are taken straightaway if necessary."
Y2K demanded that many organizations map their "cybergeography," says Robbins-Gioia's Anich. "One of the take-aways from Y2K is that firms have a greater understanding of the business systems that drive their firm. They've taken an internal look at their central IT nervous system, and physically mapped out and remediated it. They've also gained a greater understanding and awareness of the global complex network of linkages for their business."
Windows and Distributed Systems
When it comes to distributed systems such as Windows NT, the main lesson was that "Y2K really highlighted how little control we have over our desktop environment," says Jerry Ciesla, program manager with California Casualty Co. "That includes the software that's out there, either being used for production or being used to amuse people."
For the U.S. Navy, monitoring and remediating Windows NT and other distributed systems was too much for its centralized task force. "When you have an organization with 800,000 people across 20 time zones, you have to rely on individual system owners and base owners to take care of their issues," the Navy’s Wennergran says. "There's no way that one small group of people in Washington, D.C., could really make sure that everybody's software was ready. Each manager had responsibility for certain aspects of this very large enterprise. They had to make sure that they understood what versions of software they had, and that they were getting the right upgrades. Information sharing was very important."
Although execution was decentralized, the Navy's master plan needed to be highly centralized. "We needed a central plan," Wennergran says. "That's where the Navy CIO, the Y2K project offices, Chief of Naval Operations staff, and the Commandant of the Marine Corps staff worked together with folks from throughout the Navy to come up with a standard strategy for approaching Y2K."
Many glitches the Navy found during its testing process turned out to be interoperability issues rather than Y2K issues. "For every Y2K problem, there were four or five other kinds of interoperability issues that we found, were able to work on, and make better. We're going to use this methodology now and into the future, as we go into further interoperability testing," Wennergran says.
Many of these interoperability tests covered systems that connected across the entire department and other branches of the service.
California Casualty's Ciesla believes the success by many companies in meeting the Y2K challenge illustrates how well IT performs when it has a clear-cut mission and deadline. "It looks like IT, when the project is clearly defined, can get things done." Unfortunately, many times IT managers have to engage in projects without such a clear mandate and timeline, he explains, and the project suffers.
Ciesla's operation at California Casualty, which runs a range of systems to support insurance and policy issuance operations, ran smoothly during the New Year's rollover. The firm employed a host of tools, including Foundation/2000 from Client/Server Technologies Inc. (www.f2k.com). "To a lot of us, it all seems 'too easy' now," he observes. "Of course, it wasn't easy, and nobody wanted to touch Y2K projects with a 10-foot pole over the last two years."
Money Well Spent?
Some industry analysts claim too much may have been spent on Y2K. International Data Corp. (IDC, www.idc.com) labeled this overspending a "hype tax," brought on by media-driven fears and the threat of litigation. IDC estimates that U.S. companies spent a total of $122 billion on Year 2000 fixes, of which up to $41 billion -- roughly a third -- was unnecessary.
California Casualty’s Ciesla says many companies spent too much on Y2K because they were throwing as much money as they could at the problem to make it go away. "Everything that we do is a stopgap, or a Band-Aid," he says. "Because if you do a project that's not a Band-Aid then that means it's going to last forever -- and that's baloney."
But to others, Y2K money was money well spent. "Technological improvements aside, the money was still well-spent if we got nothing more than the silver linings of Y2K that accrued to almost everyone involved -- such as skills, knowledge, and attitudes," University of North Texas’ Kappelman says. "We know we got plenty more than that anyway."
Kappelman points out that since the failure rate for all IT projects hovers around 25 percent, Y2K overspending was not beyond the norm.
"Of course, if Y2K was an opportunity to make other IT improvements for which there was a business case to make, then that is not waste at all," he says. "We'll never know with certainly what the actual cost was, or how much of Y2K spending went to other needed improvements, or what spending might really be considered actual waste, except maybe at the level of the individual enterprise. But one thing is for sure, there's plenty of waste in the rest of IT spending and it appears that all the Y2K spending netted out as a globally positive investment to clean up what was largely a really stupid mistake."
Some IT departments may have gone overboard in executing zero-tolerance policies toward minor software glitches. "If there is one lesson we can take away from Y2K, it is that 'adequate' is usually good enough," says Jon Huntress, associate producer of the Year 2000 Information Center (www.year2000.com). "There is an enormous amount of fudge in the system because of all the human interfaces. Systems and applications are much more fault-tolerant than we gave them credit for being. Five years working with Windows should have taught us this."
By nature, computer programmers tend to be perfectionists, Huntress says. "Computer programmers will hate this because they know their code has to be as near to perfect as it can get. This is the main reason why so many programmers were on the doomier side of the bench. They just applied their certainty of what a few mistakes in their own programs would do to the whole world of computers. But it didn't follow. There will be lots more glitches and problems that come up, of course, but they won't all happen at the same time and the IT staff will probably be able to handle them in most cases without too much overtime."
Kappelman says Y2K was a symptom of underlying quality and management problems that already existed. "As we clean up the remaining Y2K mess, staying ever vigilant about Y2K and all other system risks, it's time to make sure we keep the benefits gained from Y2K -- while learning from our losses and errors -- as we move onward and upward and really figure out how to use these remarkable technologies we've created."
In the end, Y2K demonstrated that IT is an organization-wide resource, not a functional area or task relegated to one or two departments.
"One of the things we learned from the Y2K process was how pervasive information technology has become in our lives, and how it involves every aspect of the organization," the Navy’s Wennergran says. "When you have an issue like Y2K, it's a CEO focus rather than just a CIO focus."
Whether spending on Y2K was too much or too little, the effort has prepared companies for other challenges that some IT departments feel should be handled as carefully as Y2K was. To many industry analysts and participants, e-commerce represents such a challenge, since it involves every component of the organization and its supply chain. To large organizations that increasingly rely on digital knowledge, information assurance is another formidable challenge. In a world rife with computer hackers and viruses, security needs the same level of urgency that Y2K received.
"The only difference [between Y2K and information assurance] is there's no countdown clock where it all of a sudden goes away. We really need to make sure that we have a strong sense of how to protect our systems and still function," Wennergran says.