Defragmenters Unite: Boosting Performance on Windows NT/2000

It is becoming increasingly well known among network managers that Windows NT/2000 performance can be enhanced by keeping files and free space defragmented. What isn’t so well understood, though, is the volume of gains available and the fact that defragmentation software may in many cases provide an economical alternative to an upgrade of system hardware.

"Defragmenters are rising sharply in popularity as people realize they can often deliver comparable performance gains to hardware upgrades at a fraction of the cost," says Paul Mason, Vice President Infrastructure Software Research at IDC. "This might be related to the apparently diminishing returns that hardware upgrades can frequently provide."

IDC conducted an in-depth study of defragmentation that is delineated in a report entitled "Disk Defragmentation for Windows NT/2000: Hidden Gold for the Enterprise." This report concludes that the performance gains available from defragmentation software can extend the life of hardware by as much as two full years, saving companies many thousands, if not millions of dollars, per year.

To establish a basis of comparison between hardware upgrades and defragmentation, IDC utilized corporate averages -- a hardware replacement every three years at a cost of $3,000 per workstation. With replaced workstations having a residual value of around $300 at the end of three years, each machine cost the company around $900 a year. Other cost factors were also entered into the equation: IDC estimated six hours of IT labor per desktop replacement at $40 per hour, with detailed investigation showing it takes about two-and-a-half hours to remove an old system and over three hours to install a new one.
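IDC's hardware-side arithmetic can be reproduced directly from the figures above; the sketch below assumes the labor hours are incurred once per three-year replacement cycle.

```python
# Annual cost of the hardware-upgrade cycle, per workstation,
# using the IDC averages quoted above.

PURCHASE_PRICE = 3000   # replacement workstation, USD
RESIDUAL_VALUE = 300    # resale value after three years, USD
CYCLE_YEARS = 3         # corporate replacement cycle
IT_HOURS = 6            # ~2.5 h to remove the old box, ~3 h to install the new one
IT_RATE = 40            # IT labor, USD per hour

hardware_per_year = (PURCHASE_PRICE - RESIDUAL_VALUE) / CYCLE_YEARS  # 900.0
labor_per_year = (IT_HOURS * IT_RATE) / CYCLE_YEARS                  # 80.0

print(f"Hardware: ${hardware_per_year:.0f}/yr, labor: ${labor_per_year:.0f}/yr, "
      f"total: ${hardware_per_year + labor_per_year:.0f}/yr per workstation")
```

At roughly $980 a year per seat on the upgrade cycle, the comparison against a one-time $49.95 defragmenter license becomes straightforward.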

For the other side of the equation, IDC used a popular enterprise defragmenter costing $49.95 per workstation and $259 per server license (list price). With a defragmenter featuring centralized controls and automatic "Set It and Forget It" scheduling, IT time was computed at approximately two hours per month. Surprisingly, analysts discovered that it required the same amount of supervision to monitor 10 machines as it did 10,000.

Analysts calculated that defragmentation can allow companies to defer hardware upgrades, realizing considerable savings. "Effective use of defragmentation technology can produce comparable performance gains to costly system upgrades at a fraction of the cost," says IDC Analyst Steve Widen. "As the level of server and workstation deployment increases, the cost effectiveness of defragmentation increases exponentially." Although not stated in the report, IDC estimates that over $6 billion per year is wasted on unnecessary hardware upgrades purchased in an attempt to mask the performance impact of fragmentation.

Performance Gains

Performance testing was also conducted recently by National Software Testing Labs (NSTL). Real-world testing was done on both Windows 2000 and NT, and upon completion NSTL stated, "Theoretical analysis and real-world performance testing demonstrate that fragmentation has an adverse impact on system performance. The best way to avoid these fragmentation problems, and to keep the system running at optimal performance, is to run a defragmentation program on a regularly scheduled basis."

NSTL carried out tests on various configurations running Excel, SQL Server 7.0, Outlook and Exchange. On a 266 MHz Pentium II workstation with 96 MB of memory and a 2 GB IDE hard drive running Outlook and Excel, an increase in system speed of 85.5 percent was recorded. On a 400 MHz Pentium II workstation with 228 MB of RAM and a 4.2 GB hard drive, also running Outlook and Excel, defragmentation brought a performance leap of 219.6 percent.

On servers, a dual Pentium Pro 200 with 128 MB of memory and five 4 GB SCSI hard drives in a RAID 5 array, running Exchange Server and SQL Server, showed an increase of 61.9 percent on a defragmented drive. On a Pentium Pro 200 with 64 MB of RAM and two 4 GB SCSI hard drives running Exchange and SQL Server, performance rose by 83.5 percent. Windows 2000 outperformed NT after defragmentation in every case.
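One way to make percentages like these concrete: if "system speed" is read as throughput, a gain of G percent means a task that took T seconds now takes T / (1 + G/100) seconds. This interpretation is an assumption, since NSTL's exact metric is not given here.

```python
# Convert a percentage speed increase into the fraction of the original
# task time that remains, assuming "speed" means throughput.

def time_after_speedup(old_seconds, gain_percent):
    return old_seconds / (1 + gain_percent / 100)

for gain in (85.5, 219.6, 61.9, 83.5):  # the NSTL figures above
    print(f"{gain:6.1f}% faster -> task takes "
          f"{time_after_speedup(100, gain):.1f}% of its original time")
```

Under this reading, the 219.6 percent workstation result means the same work completes in under a third of the original time.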

Manual Defragmentation and Cost Effectiveness

Based on numbers like these and extensive user surveys, Microsoft decided to include a manual defragmenter inside Windows 2000. Known as "Disk Defragmenter," it functions somewhat like the defragmenters bundled with Windows 95 and 98. The utility is accessed via the Start menu, then click on Programs, System Tools and Disk Defragmenter.

The Windows 2000 manual defragmenter is more than adequate for maintaining disk performance for the individual user. When it comes to network deployment, though, the tool is severely limited. It can only defragment one partition at a time, has no priority settings, offers no scheduling features and requires administrator privileges to operate.

"Disk Defragmenter is not intended to be a tool for administrators to maintain networked workstations," recommends Microsoft in Knowledgebase article #254564. "This version is not designed to be run remotely and cannot be scheduled to automatically defragment a volume without interaction from a logged-on user."

Although most companies are unlikely to consider this option for maintaining their networks, the study also explored the subject, comparing expenses and TCO for a manual versus a network defragmenter. Analysts allocated one hour to manually defragment each server and workstation disk, taking into account the time it takes the system admin to go to each workstation to perform the task. Table 4 gives the total cost of manual defragmentation.

Unlike a manual utility, network defragmenters can schedule, monitor and control defragmentation throughout the enterprise from the system admin’s desktop. IDC discovered that centralized controls and automatic scheduling cut IT staff time demands to about two hours per month. This time was consumed in setting and adjusting defragmentation schedules and, regardless of the size of the network, it worked out at $960 a year in labor costs.
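The labor gap between the two approaches follows directly from the figures above: one hour per machine per manual pass versus a flat two hours per month of scheduling work. The monthly pass frequency below is an assumption for illustration; the $960-a-year network figure matches IDC's.

```python
# Annual IT labor cost: manual defragmentation (one hour per machine per
# pass) versus a network defragmenter (~2 h/month regardless of size).

IT_RATE = 40           # USD per hour, as in the IDC hardware figures
PASSES_PER_YEAR = 12   # assumed: one manual pass per machine per month

def manual_labor(machines):
    return machines * 1 * PASSES_PER_YEAR * IT_RATE

def network_labor(machines):
    # 2 h/month of schedule-setting, independent of network size
    return 2 * 12 * IT_RATE

for n in (10, 100, 1000):
    print(f"{n:5d} machines: manual ${manual_labor(n):>9,}, "
          f"network ${network_labor(n):,}")
```

The manual figure scales linearly with the number of machines while the network figure stays at $960, which is why the TCO argument grows stronger with every seat added.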

According to Widen, "Even though the actual numbers may vary from customer to customer, when considering the significant impact on TCO, it is difficult to find any argument to position manual defragmentation over network defragmentation."

Eliminating Performance Worries at UPS

As the word spreads about the IDC and NSTL findings, more and more companies are adopting this technology on an enterprisewide basis. At UPS, for example, defragmentation software is being deployed on thousands of desktops across the nation. "Regularly scheduled defragmentation makes data access much faster and prevents degradation of system performance," says David Zhao, Product Manager in the UPS Distributed Systems Services Division, responsible for NT design and infrastructure. "By eliminating disk fragmentation from the equation, network troubleshooting is much easier than before."

Zhao is a firm advocate of centralized scheduling. Instead of manually defragmenting each box, he remotely monitors and controls defragmentation of the entire NT network. "We schedule the program to run daily," he says. "That way we can set it and forget it, and never have any more fragmentation performance worries."

When Zhao first loaded the software, though, the level of server fragmentation shocked him. Many files contained 4,000 or 5,000 fragments, and some hard drives were so heavily fragmented that they had to be repartitioned in order to restore performance. Such levels of server fragmentation, however, are far from uncommon. A recent survey of 100 companies by American Business Research revealed that 50 percent of NT servers had files that contained between 2,000 and 10,000 fragments. At another 33 percent, files were fragmented into 10,333 to 95,000 pieces. Any server toiling under such a fragmentation burden will suffer a severe slowdown in system I/O.

Once these fragments are consolidated into contiguous files, users report faster access to documents, quicker reboots and improved backup times. "Defragmentation made applications come up much faster on my NT workstation," says UPS Systems Programmer Dennis Patti. "One compiler, for instance, that used to take a long time to load and open, now comes up rapidly."

Tallying Up the Losses

While the benefits enjoyed by UPS may be substantial, the total amount lost by medium- and large-sized companies due to fragmentation is nothing short of staggering. As covered in the IDC report, analysts tallied up "hard money" losses stemming from lowered productivity, Help Desk calls, increased IT staff costs and unnecessary hardware upgrades. "IDC estimates that corporations are losing as much as $50 billion per year as a result of not defragmenting every server and workstation on the network," reports Widen.

Where does this number come from? Factoring the estimated 19.6 million corporate licenses that Microsoft shipped worldwide between 1997 and 1999 against approximately 30 minutes lost per user each day to fragmentation yields over $100 billion a year.

However, since workers can sometimes perform other tasks while waiting for applications to launch or the system to boot, IDC halved that total. The report shows how desktop TCO can be reduced by $350 per year for the estimated 19.6 million current corporate NT users -- simply by instituting a defragmentation schedule across a network.
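The headline figures can be reconstructed from the report's inputs. The $40 hourly rate below matches the IT labor rate quoted earlier in the article; the number of workdays per year is an assumption chosen for illustration.

```python
# Reconstructing IDC's productivity-loss estimate from the quoted inputs.

USERS = 19_600_000          # corporate NT licenses shipped 1997-1999
MINUTES_LOST_PER_DAY = 30   # per user, due to fragmentation
WORKDAYS_PER_YEAR = 255     # assumed
HOURLY_RATE = 40            # USD, matching the IT labor rate cited earlier

gross = USERS * (MINUTES_LOST_PER_DAY / 60) * WORKDAYS_PER_YEAR * HOURLY_RATE
net = gross / 2  # halved: users can sometimes do other work while waiting

print(f"Gross loss: ${gross/1e9:.1f}B/yr, halved: ${net/1e9:.1f}B/yr")
```

With these inputs the gross figure lands at roughly $100 billion and the halved figure at $50 billion, consistent with the numbers Widen cites.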

If these IDC calculations are even half right, fragmentation ranks alongside viruses in its impact on corporate productivity and the bottom line.

With close to 20 percent of corporate workstations already deploying defragmentation software, the day may not be far away when it becomes a desktop staple like virus protection.

"There are some things that every computer should have" says IDC’s Mason. "Just as you need virus protection to protect from file damage, every machine needs defragmentation software to protect it from performance degradation."

About the Author: Drew Robb is a Los Angeles-based writer specializing in technology issues.

Siemens Medical Reporting Capabilities Get New Life

Siemens Medical Systems, a leading developer and marketer of medical imaging devices, needed to meet a contractual obligation to produce more than a dozen crucial service and maintenance reports from its SAP R/3 system. These reports detail such items as contract revenue; labor, equipment and miscellaneous costs; equipment inventory; open service call alerts; and closed jobs.

"The reports require us to pull information from all SAP R/3 modules, so they are very complex to produce," explains Tony Langford, Manager of Business Systems at Siemens Medical, and responsible for the company’s Windows 2000 servers.

The job was certainly getting done, as Siemens Medical was relying on programmers writing custom code in ABAP/4, SAP R/3’s programming language, to extract the necessary R/3 data for reporting purposes. Yet additional reporting requirements were constantly cropping up -- requirements that could not be adequately addressed in this fashion.

"The reports represent detailed information by customer, generated in hard code format," Langford says. "We could not pull the reports using SAP R/3’s basic reporting capabilities, and recognized we needed to develop a longer-term solution as we add additional R/3 modules."

Langford further explains that ABAP reports are expensive to produce and cannot be customized by the end user. Plus, because R/3 data is very complex, all user access to information had to be facilitated by IT staff, a severe drain on the company’s resources.

According to VP of Information Technology & Services Jay Skibinski, Siemens Medical wanted to design and implement an enterprise reporting system that provided end user access to operational data from the various business applications and systems throughout Siemens Medical, including SAP, JDE and PeopleSoft.

"We also wanted to leverage packaged solutions for reporting needs that could be tailored to fit our specific requirements," Skibinski says. "We wanted to provide an intuitive interface for users to access pre-defined reports, as well as generate their own ad hoc reports from the information contained in the data warehouse/reporting system. Ultimately, we wanted to empower the end user."

To meet these requirements, Siemens Medical turned to two vendors: Acta Technology, the data warehouse industry’s first vendor focused exclusively on providing packaged ERP data warehouse solutions; and Cognos, with their PowerPlay OLAP tool and Impromptu query and reporting tool. Siemens selected the Acta RapidMart for Sales Analysis, the Acta RapidMart for Cost Analysis, and ActaWorks, the company’s ERP-to-data warehousing extraction, transformation and loading (ETL) tool, to help Siemens successfully develop and deploy an enterprise data warehouse and SAP R/3 reporting system.

Siemens is using Cognos’ PowerPlay for its multidimensional (OLAP) requirements and Impromptu for its standard reports and ad hoc queries. Of course, there was still the issue of a systems integrator to pull the project together. Plus, although Siemens installed SAP in October 1997, it did not install the service management module until June 1999. As Siemens was developing the service management module, it decided that a data warehouse would be the best place to create reports for that module.

These complex tasks fell to Exiom Technologies (formerly Root Consulting), a consulting services company with focused expertise in the areas of data warehouses and enterprise resource planning (ERP) applications.

"When we contacted Cognos and ACTA and decided to use both products, they said we could use their consulting expertise or go to a third party," says Lonnie Reiver, Data Warehouse Manager of Siemens Medical. "One company that had worked with both companies was Exiom." Exiom is, in fact, an Acta Works partner and a Cognos premium partner.

"We looked at quite a few factors," he continues. "Their proximity to us, which offered not only quick availability of consultants but work space to conduct training sessions on both Cognos and ACTA, as well as their extensive experience with data warehouses, were big factors." So, with software and hardware already in place, Siemens turned to Exiom for total project integration, start to finish. They needed data marts for various business units, and needed all user requirements fulfilled.

"Exiom looked at our business environment and reverse-engineered the rapid marts, after which they helped us build the data model," said Reiver. "The service management piece was all customized -- Exiom had to do custom work, determining what fields, tables, data flows were required. Then they helped us create reports off of that data."

Concurrently, Exiom was also training the Siemens employees so they could take over and maintain the system. "That was a challenge for us because ACTA and Cognos are used in the client/server environment, while many of the people on the project came from a mainframe environment," Reiver says.

Siemens Medical is using ActaWorks to extract SAP R/3 data nightly to refresh the warehouse before the start of the next business day. In its previous environment, custom ABAP programs would have been written to handle the extractions now performed in ActaWorks’ GUI-based "drag-and-drop" environment. According to Langford, the performance of the new system, although not formally measured, has been more than acceptable. "We’re running 52 data warehouse jobs nightly and easily getting through prior to the next business morning."
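The nightly refresh described above amounts to running a batch of extract jobs inside a fixed window and confirming they finish before the business day starts. The sketch below illustrates that pattern generically; the job names and timings are hypothetical, and this is not ActaWorks' actual interface.

```python
# A generic sketch of a nightly warehouse-refresh driver: run a list of
# extract jobs in sequence and verify the batch beats the morning deadline.
# Job names and durations are hypothetical illustrations.

from datetime import datetime, timedelta

def run_nightly_batch(jobs, start, deadline):
    """Simulate each extract job's run time; flag any that finish late."""
    clock = start
    for name, minutes in jobs:
        clock += timedelta(minutes=minutes)  # advance by the job's run time
        status = "OK" if clock <= deadline else "LATE"
        print(f"{clock:%H:%M}  {name:<15} {status}")
    return clock <= deadline

jobs = [("sales_extract", 35), ("cost_extract", 25), ("service_mgmt", 50)]
start = datetime(2000, 1, 1, 1, 0)      # 1:00 AM kickoff
deadline = datetime(2000, 1, 1, 7, 0)   # before the business morning
run_nightly_batch(jobs, start, deadline)
```

Siemens' 52 nightly jobs follow the same shape at larger scale: as long as the batch clears the window, the warehouse is fresh when users arrive.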

Acta’s Sales Analysis RapidMart provides an "out-of-the-box" warehouse solution that includes a target schema, source-to-target mappings and transformations that handle change data capture, hierarchy extraction, error recovery and other complex data warehouse processes. Siemens currently has more than 40 Cognos PowerPlay and Impromptu reports accessible through its intranet, distributed to over 100 people. As for benefits, generating reports out of the data warehouse has eased the performance burden on the transaction processing system. "This would have cost us substantial dollars to purchase more hardware and beef up infrastructure," says Reiver. "We saved the wear and tear on the transaction system."
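Change data capture, one of the processes the RapidMart handles, is at heart a diff between the source system and the last-loaded snapshot. The minimal sketch below illustrates the idea with hypothetical keys and rows; it does not reflect Acta's implementation.

```python
# A minimal illustration of change data capture: compare current source
# rows against the last-loaded snapshot and emit only inserts and updates.
# Keys and row values are hypothetical.

def capture_changes(snapshot, current):
    """Return (inserts, updates) of current rows relative to snapshot."""
    inserts = {k: v for k, v in current.items() if k not in snapshot}
    updates = {k: v for k, v in current.items()
               if k in snapshot and snapshot[k] != v}
    return inserts, updates

snapshot = {1: ("ACME", 100), 2: ("Globex", 250)}
current  = {1: ("ACME", 100), 2: ("Globex", 300), 3: ("Initech", 75)}

inserts, updates = capture_changes(snapshot, current)
print("inserts:", inserts)   # only the new row
print("updates:", updates)   # only the changed row
```

Loading just the inserts and updates, rather than the full table, is what keeps a nightly refresh of dozens of jobs inside its window.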

Just eight weeks after installing the Acta RapidMart and ActaWorks products, the major components of the key R/3 extracts were being performed and loaded into Siemens Medical’s target data warehouse. Just four months after project inception, Siemens Medical successfully deployed phase one of its data warehouse implementation, with two more phases scheduled for later in the year.

"At this point we’ve just scratched the surface of the reports we want to create," said Reiver. "Most are query type [using Impromptu]; the data warehouse is the platform that will allow us to build multidimensional cubes in PowerPlay for OLAP reporting. And, we will look to Exiom to help us implement these cubes."

"When we started the project, Siemens wanted to go live with SAP in three to four months," says Raghunathan Sankaran, the lead Exiom consultant. "At the same time, they also wanted to be ready with their DW, so that’s how it initiated. We started with a few sample reports, then started building a schema on that, then started customizing extracts in Acta, with their business users and data mapping."

Siemens Medical estimates that it saved six months and over $1 million implementing the Acta solution versus employing ABAP programmers to do the job. "That represents initial implementation costs only," Langford says. "We will also save in ongoing maintenance as well as the cost of producing new reports."
