Datacenter Server Application Certification

The Windows 2000 Datacenter Server logo will be, without question, the most difficult Microsoft application certification for ISVs to obtain.

A close review of the hoops an ISV must leap through to earn certification shows that end users who buy Datacenter-certified applications can count on getting a very reliable application that behaves well in the locked-down Datacenter environment. IT buyers, however, should be aware that the certification process is not designed to indicate how well an application will perform on Microsoft’s most scalable platform.

With Windows 2000, Microsoft introduced a new application certification program. Unlike previous Windows NT logo programs, the new certification program is a highest-common-denominator approach that designates application vendors who have specifically written applications to behave well in Windows 2000 environments.

Since Microsoft and testing partner Veritest unveiled specifications in 1999, dozens of applications have earned the logo for Windows 2000 Professional and Windows 2000 Server. The Certified for Windows 2000 Advanced Server designation graces a handful of applications.

This fall, Microsoft released Windows 2000 Datacenter Server, its high-end operating system, with such an emphasis on support that Microsoft changed its distribution model. To compete with service-rich Unix and mainframe systems, Microsoft sells Datacenter Server only through OEMs whose systems pass a 14-day stress test and who promise to offer Datacenter Server customers 24x7, high-priority support (see infobox for a list of available systems).

For this ultra-locked-down environment, Microsoft introduced a similar certification process for Datacenter Server applications: vendors must prove their applications can handle the high-end features of Datacenter Server and that they are ready to support the high-demand customers the systems will ship to.

Microsoft’s SQL Server 2000 database was the first application to pass the new test, and the only one to have done so by late December.

To earn Datacenter certification, an application must first pass all the requirements of Windows 2000 Server and Windows 2000 Advanced Server certification. Then an application must exploit the distinct features of Datacenter Server. It must install and run on a 32-processor server, run in memory above 4 GB, install and run in a four-node cluster configuration, and run correctly under Job Objects.

Then an application must prove itself ready for the demanding environments expected in the data center by surviving two 48-hour stress tests and making debugging symbols available. Finally, there is a test for the vendor, who must document a 24x7 support structure ready to plug into the single-point-of-contact support infrastructure Microsoft set up through the Windows Datacenter Program.

The memory test is one area where the certification specification won’t tell users how well an application takes advantage of Datacenter’s capabilities.

Datacenter Server features a kernel enhancement, Physical Address Extension (PAE), that allows it to take advantage of all physical memory installed on a machine. This feature enables Windows to address up to 64 GB of RAM, smashing the 8 GB limit of Windows 2000 Advanced Server and the 4 GB limit of Windows 2000 Server and Windows NT 4.0 Server.
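The 4 GB and 64 GB figures fall directly out of address widths: standard 32-bit addressing covers 2^32 bytes, while PAE widens physical addresses to 36 bits. A quick sanity check of the arithmetic:

```python
# Address-space arithmetic behind the memory limits discussed above.
GB = 2 ** 30

standard_limit = 2 ** 32   # 32-bit physical addressing
pae_limit = 2 ** 36        # PAE widens physical addresses to 36 bits

print(standard_limit // GB)   # 4  -> the 4 GB ceiling without PAE
print(pae_limit // GB)        # 64 -> the 64 GB ceiling with PAE
```

(The intermediate 8 GB figure for Advanced Server is a product-edition cap, not an addressing limit.)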

There are two ways Datacenter lets applications use this memory. One is for the operating system to assign applications to memory blocks above 4 GB, so several applications can exploit large memory spaces without starving other applications on the same machine. But Datacenter also provides a way for a single application, such as a large database, to use nearly all of the memory for extremely high performance, through a new Microsoft API called Address Windowing Extensions (AWE).

To earn the Datacenter certification, an application need only show that it can handle being assigned to high memory, not that it is written to exploit the full memory capabilities of the operating system. If a vendor does choose to use AWE, it must provide documentation showing system administrators how to set limits on memory usage.
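The windowing idea behind AWE can be pictured with a toy simulation: the process keeps a small in-address-space "window" and remaps it over different regions of a much larger physical buffer (in the real Win32 API, remapping is done with calls such as MapUserPhysicalPages). This sketch only models the concept; the class and method names are illustrative, not Microsoft's API:

```python
# Toy model of address windowing: a small movable window over a
# "physical" buffer too large to map into the address space at once.

class WindowedBuffer:
    def __init__(self, total_pages, window_pages, page_size=4096):
        # Backing store standing in for large physical memory.
        self.pages = [bytearray(page_size) for _ in range(total_pages)]
        self.window_pages = window_pages
        self.base = 0  # physical page the window currently starts at

    def map_window(self, first_page):
        # Analogue of remapping the window onto new physical pages.
        self.base = first_page

    def read(self, window_page, offset, length):
        # All access goes through the small window.
        if window_page >= self.window_pages:
            raise IndexError("access outside the mapped window")
        page = self.pages[self.base + window_page]
        return bytes(page[offset:offset + length])

    def write(self, window_page, offset, data):
        if window_page >= self.window_pages:
            raise IndexError("access outside the mapped window")
        page = self.pages[self.base + window_page]
        page[offset:offset + len(data)] = data

buf = WindowedBuffer(total_pages=1024, window_pages=4)
buf.map_window(1000)            # slide the window near the top of memory
buf.write(0, 0, b"hi")
assert buf.read(0, 0, 2) == b"hi"
```

A database using AWE does essentially this at scale, sliding its window across tens of gigabytes of buffer cache.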

In processor support, too, there are limits to what the certification logo on an application signifies.

An IT manager gets some help here from the specification, which differentiates between utilities and scalable applications. Vendors of utilities, such as anti-virus software, only need to show that the application installs and runs on a 32-processor machine. Scalable applications, such as databases, messaging software, virtual machines, or modeling software, must make use of all the processors in the server, be it an eight-processor server or a 32-processor server.

The specification is clear that it is not important how effectively an application takes advantage of those processors.

“Load balancing is a best practice and not a requirement,” the certification specification reads. “In other words, if more than four processors are partially used, but one is carrying significantly more of the load than any of the others, then that would not constitute a failure.”

Tests for clustering in the Datacenter specification look pretty robust. Applications have to install on two, three, and four nodes in the Windows 2000 Datacenter Server test.

This requirement reflects one of the differences between Advanced Server and Datacenter Server: Microsoft supports two failover nodes on Advanced Server, and up to four nodes on Datacenter Server.

Microsoft’s specification also says applications cannot assume how many nodes are in a system. An ISV cannot certify an application for Datacenter Server if it installs only on a four-node cluster or a three-node cluster; the application must support any number of nodes between two and four.

As with the n-node requirement, ISVs are expected to be flexible in the way applications fail over. When a node fails, any node must be able to pick up the application: the software cannot specify which node will perform the new duties in place of the failed node.
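The any-node rule can be sketched in a few lines (a simulation of the policy, not the Windows Cluster Service API; the function name is illustrative): the failover logic must accept whatever surviving node the cluster offers, never a hard-coded target.

```python
# Sketch of the "any surviving node may take over" rule.

def fail_over(nodes, failed_owner):
    """Return a new owner chosen from the surviving nodes."""
    survivors = [n for n in nodes if n != failed_owner]
    if not survivors:
        raise RuntimeError("no surviving node to host the application")
    # Any survivor is acceptable; the application may not insist on
    # a particular node. Here we simply take the first one offered.
    return survivors[0]

cluster = ["node1", "node2", "node3", "node4"]
new_owner = fail_over(cluster, "node3")
assert new_owner in cluster and new_owner != "node3"
```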

Because of the way Windows handles storage sharing, Microsoft added a note to this requirement. Cluster nodes cannot share ownership of storage, such as a NAS device; if the node owning a device fails, ownership of the device must transfer to another node so the application can keep running.

Clients must also survive failure of the server application without crashing or affecting the stability of the system.

This requirement pertains to software running on machines other than the Datacenter Server machine. For example, if a database node fails in a cluster, Microsoft expects that the client application’s inability to receive data will not cause the application or the client machine to crash. The specification suggests that ISVs include a “not available” message for this event.
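In code terms, the spec is asking for little more than a guard around the server call: catch the failure and degrade to the "not available" message instead of letting the exception take the client down. A minimal sketch, where `fetch_from_server` is a hypothetical stand-in for the real network call:

```python
# Client-side resilience: server failure degrades to a message,
# never a client crash.

class ServerDown(Exception):
    pass

def fetch_from_server(query):
    # Stand-in for a real network call; here the server is always down.
    raise ServerDown(query)

def fetch(query):
    try:
        return fetch_from_server(query)
    except ServerDown:
        # Catch the failure rather than letting it propagate and
        # destabilize the client application or machine.
        return "not available"

print(fetch("SELECT 1"))   # -> not available
```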

The clustering portion carries over into the stress testing. Veritest puts the applications through two different 48-hour stress tests. One occurs on a four-node cluster of eight-processor Compaq servers and includes multiple failover conditions. The other occurs on a 32-processor Unisys server and features normal and peak user loads. The application must remain stable for the entire two-day, unattended test period.

An application must also operate correctly under the control of Job Objects, and, conversely, it must allow itself to be placed under Job Object control in the first place.

Job Objects perform a duty similar to that of Job Control Language on a mainframe. The feature controls processes to prioritize and queue processor requests, allowing administrators to ensure that important, mission-critical applications receive priority on the machine.

If an application crashes while under Job Object control, the feature and the application would be of little use to an administrator running a back-end system. In addition, the last thing an administrator wants is anti-virus software, for example, taking priority on a machine and slowing down a database.
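The policy Job Objects enable can be illustrated with a toy priority scheduler (a simulation of the idea only; the real Win32 API works through calls such as CreateJobObject and AssignProcessToJobObject, and the class here is invented for illustration): mission-critical work is dispatched ahead of background utilities.

```python
# Toy priority scheduler illustrating the Job Object policy:
# important work runs first, background utilities wait.
import heapq

class Scheduler:
    def __init__(self):
        self._queue = []
        self._order = 0  # tie-breaker keeps FIFO order within a priority

    def submit(self, priority, name):
        # Lower number = higher priority.
        heapq.heappush(self._queue, (priority, self._order, name))
        self._order += 1

    def run_next(self):
        _, _, name = heapq.heappop(self._queue)
        return name

sched = Scheduler()
sched.submit(5, "anti-virus scan")    # background utility
sched.submit(1, "database query")     # mission-critical work
assert sched.run_next() == "database query"
```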

The test also requires the availability of debugging symbols. This requirement, too, reflects Datacenter Server’s focus on high availability. Microsoft wants administrators to be able to give detailed problem reports to the software vendor in the event that a crash occurs.

Microsoft suggests that ISVs offer debug symbols when the application is installed, so that when a system crashes or is about to crash, administrators have information to pass on to the vendor or to help fix the problem. As with applications for other Windows platforms, Microsoft assumes the appropriate debug symbols will appear in a pop-up window when an application fails.

More than with any of the other Windows 2000 logos, the Datacenter platform is the one where certification makes a significant difference.

Tool vendors who pursue this logo indicate a level of commitment to the platform that is a must for enterprises that need to run a rock-solid system. NetIQ, BMC, Veritas, and Computer Associates, for example, have publicly stated that they’re seeking Datacenter certification. Any tools vendor serious about selling to Datacenter customers should get in the testing line.

For databases, certification is a case-by-case question. Microsoft had to be first on this one with SQL Server because the company had to prove its Windows NT-only database could scale across more than eight processors. The SQL team had to develop a hotfix to get the database to scale past 20 processors on the test, so it turned into a valuable development exercise.

Expect to see IBM get its DB2 certified soon. Big Blue’s DB2 was among the first applications to earn the Windows 2000 Server and Advanced Server logos. Oracle, on the other hand, has promised not to seek certification. The company is committed to supporting its database on the platform, but Oracle hardly needs to prove to anyone that its code scales. Not seeking certification is more an issue of marketing and pride than anything else.

Ultimately, IT buyers looking at scalable applications on a 32-processor, 64-GB machine will still need to pose two questions that the certification process doesn’t answer: Does the application take advantage of all the memory that’s available, and what work has been done to balance the workload across all the processors?

In the end, a Datacenter logo on an application should carry a lot of weight in the Windows 2000 Datacenter Server space, where quality control is key.


[Infobox] Certified Datacenter Hardware: Single Systems

Complete systems -- server hardware, operating system, storage -- undergo intensive screening, including a 14-day stress test, before Microsoft allows OEMs to resell them with Datacenter Server. The following systems are approved for sale with Windows 2000 Datacenter Server as of late December:

Compaq ProLiant 8500 Datacenter Solution

Dell PowerEdge 8450/550 MHz

Dell PowerEdge 8450/700 MHz

Fujitsu Primergy N800

Fujitsu Siemens Primergy N800

Hewlett-Packard NetServer LXr 8500 DC 700

IBM Netfinity 8500R Datacenter Server

IBM Netfinity 8500R Datacenter Server w/ EMC Symmetrix 5.0

ICL Trimetra e.server2000 level 7.1

Unisys e-@ction ES7000 Plateau 7.1



Certified Datacenter Hardware: Four-Node Clusters

Compaq ProLiant Cluster HA F500 (ProLiant 8500 four-node Data Center Solution)

Hewlett-Packard NetServer LXr 8500 DC four-node 700 (eight processors) / HP SureStore E Disk Array XP256 / HP D8602A


Certified Datacenter Hardware: Two-Node Clusters

Unisys e-Action ES7000 with 16 CPU/server and Clariion FC4500, EMC Symmetrix 5.0, EMC Symmetrix 4.8, or ESM/OSR/CSM 7800

Microsoft Corp., Redmond, Wash.,

Veritest, Los Angeles,