Windows 2000: A Shift in Thinking for Microsoft

Microsoft Corp.’s query, "Where do you want to go today?" is an offer it makes to its many customers. But after spending a day on the Redmond campus recently, attending briefings with BackOffice and Windows 2000 product managers, it appears the question need not apply in house. Microsoft already knows where its enterprise computing initiatives need to go.

After numerous delays in the Windows 2000 schedule, product groups are gearing up for the launch later this year. Indications are that Microsoft’s internal schedule is starting to firm up. A Microsoft spokesman also confirmed that Beta 3 -- which happens to be Release Candidate 1 -- will go out the door this month.

Product groups have sorted out their plans to roll out upgrades, where it makes sense, shortly after Windows 2000 ships. SMS, for instance, will see another release follow the 2.0 product that only recently started shipping. The major difference is that the next release -- code-named "Emerald," with no version number settled yet -- will include Active Directory integration. Otherwise it won’t be a major upgrade to the existing tool. On the other hand, SQL Server 7.0 won’t be seeing a dot release specifically influenced by Windows 2000.

I was particularly relieved to learn some of the details of Microsoft’s reliability strategy for Windows 2000 (see article on page 1). It’s been clear to the buying public for some time, and apparently to Microsoft as well, that Windows NT 4.0 simply is not reliable enough to take on the most important computing assignments in corporate America.

Microsoft wants very badly to play in the mid- to upper-end of the distributed computing space, but playing there means fielding a machine that is as rock-solid and reliable as the systems it must compete against. That means being measured against benchmark systems such as AS/400s, OpenVMS systems and Unix variants. The argument of a favorable price/performance ratio is compelling, but that metric goes out the window when it comes to mission-critical applications that must be up and running without failure. So far, Windows NT has come up short on the reliability side of the equation.

Microsoft may be proud of its Windows 2000 daily build cycle, which includes the testing of thousands of systems and even more combinations of peripherals, but that’s not the only thing it needs to do to achieve higher levels of reliability. An important step in addressing this problem is reducing the number of potential configurations that users may deploy.

Instead of the anything-goes mentality that still exists in the PC space, Microsoft is adopting a more disciplined approach to configuring high-end Windows NT systems. The bottom line is that Microsoft is scaling back the subsets of components that will be supported in high-end versions of Windows 2000. Datacenter Server will include an even smaller set of supported hardware devices and drivers than will Advanced Server. While this may limit configuration choices somewhat, it’s the kind of enterprise-oriented approach that Microsoft has needed to use.

Computers like the AS/400 have long offered only a limited subset of peripheral components. While many users perceived that limited selection as a negative that drove up the AS/400’s hardware costs, I never heard anybody complain about the reliability of that machine.

The bad news is that these reliability initiatives are all built around Windows 2000. At this time, Microsoft is not working on a similar initiative for Windows NT 4.0. That means for all the "legacy" systems still running Windows NT 3.51 or 4.0, you’re on your own -- or you need to tap one of the 99.x percent uptime programs offered by Data General, IBM and others.

Users have voiced their concerns about reliability, and it appears that Microsoft is seriously trying to address them. How successful the company is in rising to this challenge has yet to be seen. At least Microsoft knows there’s a challenge to be met, and it wants to go there today.