Addressing Y2K on the Desktop

While IT organizations feel confident that they have averted the Year 2000 problem, it appears as if Y2K problems on the desktop have been largely ignored.

With less than seven months remaining, IT organizations will be up against one of the biggest challenges ever to face our industrialized society.

While IT organizations may feel confident that they have averted the Year 2000 problem by staffing project teams to deal with Y2K date fields in their legacy mainframes and servers, Y2K problems on the desktop are being largely ignored.

An alarming atmosphere of false security exists, driven primarily by assurances of Y2K compliance by desktop vendors and systems providers. The harsh reality, according to Stephanie Moore, Senior Industry Analyst, Giga Information Group, is that roughly 50 percent of the PCs on desktops today are not Year 2000-compliant. In addition, a recent industry article suggested that "correcting Year 2000 problems at the PC level makes up more than $90 billion of the estimated global problem."

Is it any wonder that many IT executives shrink from the thought of their Y2K desktop problems? In many organizations, especially those that have been successful and have experienced rapid growth, the acquisition of PC hardware and software has proceeded in a relatively disorderly fashion, without the centralized management and standards-setting normally applied to the acquisition of technology for the data center or enterprise network. Simply finding all of the PCs and figuring out who owns them and what software has been installed on them is a monumental task – let alone visiting every desktop to update or fix it.
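The audit task described above boils down to rolling raw per-desktop scan data up into the questions IT actually asks: how many machines, running which operating systems and applications. A minimal sketch in Python illustrates the idea; the record fields, hostnames and data here are hypothetical, not from any particular inventory tool.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class DesktopRecord:
    """One scanned desktop; in practice an agent would collect this."""
    hostname: str
    owner: str
    bios_date: str        # BIOS release date reported by the scan
    os: str
    installed_apps: tuple


def summarize(inventory):
    """Aggregate a raw desktop scan into totals by OS and by application."""
    os_counts = Counter(pc.os for pc in inventory)
    app_counts = Counter(app for pc in inventory for app in pc.installed_apps)
    return {"total": len(inventory),
            "by_os": dict(os_counts),
            "apps": dict(app_counts)}


# Hypothetical sample scan of two desktops:
inventory = [
    DesktopRecord("acct-01", "j.smith", "1994-06-01", "Windows 3.1",
                  ("Lotus 1-2-3",)),
    DesktopRecord("eng-07", "p.jones", "1998-03-15", "Windows 95",
                  ("Excel", "Lotus 1-2-3")),
]
report = summarize(inventory)
```

A report like this immediately surfaces, for example, how many Windows 3.1 machines exist and therefore cannot be economically remediated.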

What kind of approach should be taken to solve the Y2K problem at the desktop?

First of all, it should address the organization’s short-term need to remediate existing desktops. This is complicated somewhat by the fact that five different levels of Y2K desktop compliance are involved – namely the BIOS, operating system, network operating system (e.g., Novell NetWare or IntraNetware), applications (both custom and shrink-wrapped) and user data files, such as spreadsheets and databases.
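At the user data file level, the core defect is two-digit year fields in spreadsheets and databases. A common remediation technique is "windowing": interpreting two-digit years relative to a pivot year rather than assuming the 1900s. The sketch below assumes an illustrative pivot of 30; real remediation tools typically let administrators choose the pivot.

```python
def expand_year(two_digit_year, pivot=30):
    """Expand a two-digit year using a fixed window: values below the
    pivot are taken as 20xx, the rest as 19xx. The pivot of 30 is an
    illustrative assumption, not a standard."""
    if not 0 <= two_digit_year <= 99:
        raise ValueError("expected a two-digit year (0-99)")
    century = 2000 if two_digit_year < pivot else 1900
    return century + two_digit_year


# "05" in a legacy spreadsheet cell becomes 2005, not 1905:
assert expand_year(5) == 2005
assert expand_year(99) == 1999
```

Windowing fixes interpretation without rewriting stored data, which is why it suits the large installed base of user-created files.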

Secondly, IT organizations should acknowledge that not all desktops can be economically remediated. Many older desktops are simply not capable of running 32-bit operating systems such as Windows 95, Windows 98 or Windows NT, due to limited processing power and memory, and Windows 3.1 simply won’t cut it after the year 2000. In fact, a recent industry article stated that "As far as the government is concerned, there is no such thing as Windows 3.1."

For many organizations, the "bulldozer" approach may be required. This concept, first advanced by Dell Computer in describing the approach taken by Eastman Chemical, a multibillion dollar chemical firm headquartered in Kingsport, Tenn., involves "bulldozing" many or all existing PCs rather than attempting to upgrade the numerous types of hardware platforms and hundreds of distinct software applications typically found in decentralized organizations.

Instead, the organization chooses to standardize on one to three hardware platforms (preferably from a single supplier), and selects a core set of standard software applications that will be supported by IT. This "bulldozer" approach is contrasted with the upgrade approach, which can be likened to moving a 40-foot pile of sand one shovelful at a time. This typically involves higher costs and requires significantly longer time to accomplish.

Organizations need to recognize that the Y2K desktop issue is not a one-time fix. Software vendors are constantly discovering new Y2K problems and issuing patches and updates. For example, Microsoft has already issued Y2K updates for Windows 98, Internet Explorer and Windows NT, and hardware vendors regularly revise their BIOS code (typically three to four times per year). The chances are high that you have received, or will receive, Y2K updates several times between now and the year 2000 – and perhaps beyond.

Therefore, IT organizations need a technology solution that automates the three approaches described above, without burdening end users or requiring technicians to visit every desktop. This ideal solution should:

Test and fix existing PCs at all five levels. For example, PCs that require a flash BIOS update should be updated over the network. This is a problem not typically addressed by traditional login script-based solutions or conventional Electronic Software Distribution (ESD) solutions, since these require a Windows operating system and network stack in order to operate, while flash BIOS programs are run from the DOS command line, typically from a floppy-based program.

Similarly, the process of installing complete operating system upgrades (e.g., from Windows 3.1 to 95 or NT) is also not typically addressed by login-based solutions or conventional ESD approaches.

Automate the process of rolling out new Y2K-compliant PCs. This includes formatting and partitioning hard drives, installing and configuring operating systems, and installing and configuring applications.

This should not mean "cloning" all PCs so that everyone’s machine looks the same. End users still desire and need a certain degree of customization, based on their line of business or personal requirements, and automated solutions should accommodate this efficiently, without extensive administrative overhead for each customized installation. However, some level of standardization is recommended since it results in significantly lower ongoing support costs and a higher level of IT service quality, compared to previous ad hoc and decentralized approaches.

Provide a "continuous configuration infrastructure" that supports rapid deployment of current, as well as future Y2K updates. The bonus of this approach is that the same infrastructure should also support future updates of any kind, such as updates to Office 2000, Lotus Notes R5 and Windows 2000, as well as future initiatives, such as support for the new Euro currency and new mission-critical e-commerce initiatives. It is also a powerful mechanism for re-centralizing IT control over distributed desktops in order to deliver the true productivity benefits of network-based software configuration and deployment.
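The rollout steps above (partition and format, install and configure the operating system, install the standard application set plus per-user extras, then apply current updates) can be sketched as an ordered plan for one desktop. Everything here is illustrative: the step names, image name and application names are assumptions, and a real deployment tool would execute such a plan over the network from a central server.

```python
def build_rollout_plan(hostname, os_image, core_apps, extras=()):
    """Assemble the ordered deployment steps for one desktop.

    core_apps is the IT-supported standard set; extras captures the
    per-user customization the article recommends accommodating."""
    plan = [
        f"partition-and-format {hostname}",
        f"install-os {os_image}",
    ]
    plan += [f"install-app {app}" for app in core_apps]       # standard set
    plan += [f"install-app {app}" for app in sorted(extras)]  # customization
    plan.append("apply-latest-y2k-updates")  # continuous configuration step
    return plan


plan = build_rollout_plan("acct-01", "win95-image",
                          ["Office"], extras=["Lotus Notes"])
```

Because the update step is just the last entry in every plan, the same infrastructure naturally carries future updates of any kind, which is the point of a "continuous configuration infrastructure."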

An IT organization’s Y2K strategy and solution is undoubtedly the most difficult and urgent plan of attack that will be carried out during the organization’s lifetime.

Critical technology factors must be addressed in order to solve the problem in a timely, efficient and cost-effective manner, with advanced technology that continues to benefit the IT organization and its end users long after the Y2K experts have gone home.

About the Author: Phil Neray is Director of Product Marketing at ON Technology (Cambridge, Mass.).