
Storage Testing: Beyond Interoperability

Testing is absolutely required to ensure that storage “solutions” will actually resemble the pretty picture in the vendor brochure. But what's really needed is the ability to test storage products before you buy them—and under real workloads.

The absence of objective performance metrics gleaned from real-world testing of storage products creates a dilemma for storage decision-makers. The problem has come to a head in recent years as many storage vendors have gone their own way with respect to testing and measuring performance, often producing statistics better suited to marketing than to decision support.

Case in point: a NAS storage vendor appeared on the market in 2000 boasting a “more than 100x” performance advantage over all incumbent vendors. The company claimed to be a “disruptive innovator” poised to stand the industry on its head.

On closer examination, however, it turned out that the vendor’s greatest “innovation” was its testing methodology. In a classic case of comparing apples to oranges, the vendor’s quoted performance numbers reflected the speed at which it could retrieve data from its memory cache, which it then compared to the advertised speed at which competitors could retrieve data from their back-end disk storage. Such a comparison inevitably favors the vendor, since reads from RAM are orders of magnitude faster than reads from disk.

The neophyte’s marketing claim didn’t hold up for long under the scrutiny of this column or of consumers, of course, and some would argue that it showed a bit of naiveté on the part of the vendor. However, examples of similar marketecture abound from established vendors as well.

Perhaps more misleading are the programs proffered by many leading storage vendors that represent certain combinations of gear (usually their own, plus products from their “solution partners”) as “proven” or “certified.” Consumers need to pay close attention to the fine print describing such programs before giving the brands any credence.

An inquiry placed to IBM, for example, led to an interview with managers of the company’s “TotalStorage Proven” program. “TotalStorage Proven,” the managers conceded, is funded by IBM’s marketing budget and is primarily a vehicle for partners to receive an IBM “stamp of approval” for a configuration that includes their equipment operated in concert with IBM-branded storage.

The program offers two levels of approval—a “standard level” that means, according to one manager, that “the devices were plugged into each other and all the lights came on.” A second, “more comprehensive,” level sees devices “plugged together for a longer period of time and some more extensive tests are performed to ensure that their equipment works with our on-the-box software.”

The managers were quick to correct the use of the words “certified” or “guaranteed” in connection with descriptions of the program: “We are very clear that we only certify or guarantee the performance of our own IBM hardware. Otherwise, support would be a big legal gotcha.” One doesn’t need a law degree to see their point.

The managers stated that, once TotalStorage Proven status was awarded, “it is a judgment call as to the requirement to retest the solution when firmware (ours or theirs) is upgraded [or other potentially destabilizing hardware or software modifications are made].” They also conceded that there is no consumer feedback loop at present that might validate the success that TotalStorage Proven solutions enjoy in the field, nor is there any information available to confirm that the two-year-old program has delivered on its primary objective: “to support the sales of IBM (and partner) storage hardware.”

What TotalStorage Proven does provide is access to several TotalStorage Solution Partnership Centers, where IBM partners can go to test with IBM gear at no charge. The managers credit the program as “an enormous value-add” for integrators like Saturn Business Systems in Manhattan, NY.

A senior account manager with Saturn is quick to point out that his SAN center houses more than $1 million worth of gear—“a full service shop … all IBM and a couple of Intel servers.” He says that IBM’s Proven program is “an important source of storage interoperability and performance data, which is in short supply from vendors.”

From his perspective, “Storage is more a commodity every day, and customers usually buy everything from a single source. That’s why we are not a multi-vendor integrator, but an IBM Business Partner instead. The TotalStorage Proven program isn’t so much a guarantee as it is a solution set that should work in theory. We can test the theory by bringing in the customer and building out their configuration in our center.”

While Saturn’s customers may be interested in “one-stop-shop” solutions delivered by a single vendor and its posse, Zerowait president Mike Linett notes that most of his assignments involve “retrofits” of technology to existing heterogeneous infrastructure, where testing is an absolute must.

Many resellers and integrators are getting into the storage “proof-of-concept” business, Linett notes. He argues that while a 24-hour “burn in” of equipment has the value of spotting most component-level problems, other purported benefits of most third-party performance testing labs tend to be specious.

Says Linett, “The Daytona 500 automobile race is not a real-world test of automobile usage. It is a test of how fast you can circle a track 500 times with no stop-and-go traffic. The same is true with a lot of the performance benchmarking offered by integrators and third party test labs. You don’t need performance testing as much as you need real world testing that emphasizes features with business value.”

Linett offers a practical example. “Say we are asked to test a solution involving a Network Appliance Filer and a Microsoft Windows 2000 file server. Truth is, the Windows server is at least 10 percent faster than the NetApp equipment, but the Filer offers a snapshot feature that the Windows box doesn’t have—and that may be more important in the realities of day-to-day operations at the client site.”

DataCore’s Mark Freidman agrees. “Getting to real-world performance measurement is a straightforward process: you need to understand read/write activity and the block sizes and other characteristics of data produced by your applications, and—if you are traversing a TCP/IP network—you need to know the distances involved. The closer the test is to real workload, the more useful the test data will be. Truth be told, I don’t know too many resellers who are any good at doing this kind of testing.”
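
Freidman’s checklist translates readily into a pre-benchmark workload-characterization step. The sketch below (in Python, against a hypothetical CSV export of an application I/O trace with op, bytes, and latency_ms columns, a format assumed here purely for illustration rather than any particular tool’s output) reduces a trace to the read/write mix, block-size distribution, and rough latency profile he describes.

    #!/usr/bin/env python3
    """Summarize an application I/O trace so a synthetic test can mirror
    the real workload: read/write mix, block-size distribution, and a
    rough latency profile (a stand-in for distance on TCP/IP links)."""

    import csv
    import sys
    from collections import Counter
    from statistics import mean

    def summarize(trace_path):
        ops = Counter()          # "read" / "write" request counts
        block_sizes = Counter()  # request size in bytes -> frequency
        latencies = []           # per-request latency in milliseconds

        # Assumed (hypothetical) trace format: one row per I/O request
        # with columns op ("read"/"write"), bytes, latency_ms.
        with open(trace_path, newline="") as f:
            for row in csv.DictReader(f):
                ops[row["op"].strip().lower()] += 1
                block_sizes[int(row["bytes"])] += 1
                if row.get("latency_ms"):
                    latencies.append(float(row["latency_ms"]))

        total = sum(ops.values())
        if total == 0:
            print("empty trace")
            return
        read_pct = 100.0 * ops["read"] / total
        print(f"requests: {total}  read: {read_pct:.1f}%  "
              f"write: {100.0 - read_pct:.1f}%")

        print("most common block sizes:")
        for size, count in block_sizes.most_common(5):
            print(f"  {size:>9} bytes   {100.0 * count / total:5.1f}% of requests")

        if latencies:
            print(f"latency: mean {mean(latencies):.2f} ms, "
                  f"max {max(latencies):.2f} ms")

    if __name__ == "__main__":
        summarize(sys.argv[1] if len(sys.argv) > 1 else "io_trace.csv")

The read/write split and dominant block sizes reported by a script like this are precisely the knobs that synthetic load generators such as fio expose, so a proof-of-concept test can be configured to resemble real traffic rather than the Daytona 500.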

Freidman says that many reseller/integrators use testing to validate connectivity, but he finds that their performance testing data is flawed whenever a customer informs him that “nobody ever looked at our workload.”

In the final analysis, testing is absolutely required to ensure that storage “solutions” will actually resemble the pretty picture in the vendor brochure. As a matter of best practice, you need to test storage products before you buy them, and under real workloads, to achieve real insights. Your thoughts? jtoigo@intnet.net

About the Author

Jon William Toigo is chairman of The Data Management Institute, the CEO of data management consulting and research firm Toigo Partners International, as well as a contributing editor to Enterprise Systems and its Storage Strategies columnist. Mr. Toigo is the author of 14 books, including Disaster Recovery Planning, 3rd Edition, and The Holy Grail of Network Storage Management, both from Prentice Hall.
