Ruling the Wintel World

The upcoming availability of platforms built on Intel Corp.’s 64-bit Itanium processor could bring a fundamental shift to the PC workstation and PC server worlds. For what uses, and when, will 64-bit Wintel computing be important? Over the next year or so, 64-bit computing will likely be adopted for specialized, high-end applications only. For general-purpose use, a long-term adoption trend will be driven by factors such as purchase price and investment protection.

Microsoft Corp. has been uncharacteristically quiet about 64-bit computing. This is probably because of concern about confusing the Windows 2000 launch message. Insiders say 64-bit versions of Windows 2000 were being built on a regular basis during the later phase of 32-bit Windows 2000 development, and will be ready to go when Itanium systems start shipping. Packaging of 64-bit Windows 2000, on the other hand, is something the company is not discussing.

One could make an argument that, for Microsoft, this transition will be similar to the shift from the 16-bit architecture used by Intel 8086 processors to the 32-bit architecture used by more modern members of the x86 family. But there are fundamental differences between the market demands that drove the 16-to-32-bit shift on the software side and those that might drive a 32-to-64-bit migration.

By the mid-1980s, the limitations of 16-bit environments became painfully obvious. One of the most nagging problems was 16-bit memory management, which limited programs to 64 KB memory segments. There was a period of intense activity in the industry to develop workarounds, and an industry consortium -- Lotus, Intel, and Microsoft -- was formed to develop standard memory expansion technologies such as the Expanded Memory Specification. The message was clear: software needed more hardware.
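The 64 KB ceiling follows directly from the arithmetic of 16-bit addressing: an offset register can distinguish only 2^16 bytes, and the 8086 stretched its reach by combining a 16-bit segment with a 16-bit offset. A minimal sketch of that real-mode scheme (illustrative only, not code from the article):

```python
# Real-mode 8086 addressing: a 20-bit physical address is formed by
# shifting a 16-bit segment left 4 bits and adding a 16-bit offset.
def physical_address(segment: int, offset: int) -> int:
    assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
    return (segment << 4) + offset

# A 16-bit offset spans exactly 64 KB -- the per-segment ceiling
# that mid-1980s software kept running into.
SEGMENT_SIZE = 0xFFFF + 1

print(SEGMENT_SIZE)                        # 65536 bytes, i.e. 64 KB
print(hex(physical_address(0xFFFF, 0xFFFF)))  # 0x10ffef, the top of reachable memory
```

The highest reachable address, 0x10FFEF, is just past the 1 MB mark; that sliver above 1 MB is the "high memory" the later workarounds exploited.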

The availability of 80286 and 80386 systems allowed extended memory to be added, and Microsoft developed some kludgy and problematic patches to allow programs to be relocated to and run from high memory. It wasn’t until Windows NT and Windows 95 that the memory frontier was pushed back to today’s 4 GB limitation.

Of course, there were lots of other less obvious benefits that the transition to a 32-bit architecture brought to the table: wider system bus structures, the ability to move data around within the system faster, and more modern I/O bus structures.

Today there are some applications that are memory-constrained in a 32-bit Windows environment. But this time Microsoft and Intel are ahead of the requirements curve. For instance, the NT File System has always been based on a 64-bit design. In Windows NT 4.0, Enterprise Edition, Microsoft introduced something called 4 GB RAM Tuning, which limits the operating system to 1 GB and makes the balance available to applications, as opposed to the default 2 GB/2 GB split. With the new Pentium III Xeon systems, Intel included something called 36-bit Physical Address Extension (PAE). When exploited by Windows 2000, PAE allows uniform access to as much as 64 GB of physical memory. Unlike the 16-to-32-bit transition, fundamental hardware limitations that cause software problems are not the core driver for an upgrade.

Another parallel to consider is the difficulty in getting users to upgrade current applications. While it’s true that most 64-bit operating systems offer 32-bit compatibility modes -- Windows 2000 included -- users won’t reap the full benefit of the 64-bit environment until they upgrade their applications. This presents a marketing problem for ISVs: For many applications, moving to a 64-bit compile will offer few -- if any -- compelling improvements.

More than a few users still have ancient 16-bit DOS applications in use, which illustrates how stubbornly legacy software persists. Getting people to abandon functional 32-bit applications and pay for an upgrade will be even harder.

This means ISVs will be faced with the choice of supporting two versions of applications, or sticking with 32-bit code until a 64-bit environment is the preferred alternative. This is a chicken-and-egg scenario if ever there was one. In fact, the only server operating environment to successfully shift its entire application portfolio over to 64-bit is IBM’s OS/400, and that happened only because OS/400’s hardware-independent architecture lets applications be carried forward to new processors automatically. Windows 2000 applications will enjoy no such shortcut; they will have to be upgraded manually.

The likely candidates for 64-bit computing in the Wintel world will be high-end servers -- particularly database servers -- that face resource/performance tradeoffs; and professional workstations that are used for modeling, computer-aided design, and electronic design. The rest of the Wintel world will probably be content to continue to play the "wait a few months for a faster processor" game.

Until hardware prices make a 64-bit operating environment the logical choice, 32-bit computing will continue to rule the Wintel world. --Al Gillen is research manager for infrastructure software at International Data Corp. and a former editor-in-chief of ENT.