The Two Towers of Storage Pain

As the year draws to a close, the “Two Towers” of pain in storage administration—capacity provisioning and backup—remain largely unsolved, despite considerable activity in the industry around these problems. Will vendor efforts yield blockbuster solutions in 2003, or are they just rearranging deck chairs on the Titanic?

According to psychologists, the unprecedented success of the film adaptation of the “Lord of the Rings” trilogy bears testimony to a long-held tenet of Hollywood movie making: during periods of economic uncertainty, people often turn to fantasy for a temporary respite from their worries.

By all accounts, the current installment of the Tolkien epic, The Two Towers, delivers on its promise to provide a few hours of distraction for the small price of a ticket. Unfortunately, however, once the closing credits have rolled and the house lights come up, the problems of the real world remain.

And so it is with data storage administration. While recent marketing campaigns by some vendors have proven entertaining, their claims to have found solutions to the “Two Towers” of storage management pain—storage provisioning and data protection—do not hold up on close examination.

In the pale light of day, storage administrators are still plagued by the need to provision applications with the storage they require “the old fashioned way”—that is, manually. Doing so demands a complicated set of skills and knowledge, both of the internal elements of the application itself and their storage requirements, and of the current disposition and utilization rate of the various storage devices in their charge. The process can be aided through the use of automated tools—up to a point.
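
To put the juggling act in concrete terms, consider the sketch below, written in Python purely for illustration. Every application name, volume, capacity figure, and threshold in it is hypothetical; the point is simply the kind of arithmetic an administrator performs by hand today, matching each application’s growth rate against the free space remaining on the volumes assigned to it.

    # Hypothetical bookkeeping behind manual capacity provisioning.
    # Volume names, capacities (in gigabytes), and growth rates are
    # invented; a real shop would pull these numbers from device and
    # application interfaces.
    volumes = {
        "vol01": {"capacity": 72, "used": 63},
        "vol02": {"capacity": 72, "used": 30},
    }
    applications = {
        "order_entry": {"volumes": ["vol01"], "weekly_growth_gb": 4},
        "data_mart": {"volumes": ["vol02"], "weekly_growth_gb": 1},
    }
    ALERT_THRESHOLD = 0.85  # flag volumes past 85 percent utilization

    def weeks_of_headroom(app_name):
        """Estimate weeks until an application's volumes fill up."""
        app = applications[app_name]
        free = sum(volumes[v]["capacity"] - volumes[v]["used"]
                   for v in app["volumes"])
        return free / app["weekly_growth_gb"]

    for name in applications:
        print(f"{name}: about {weeks_of_headroom(name):.1f} weeks of headroom")

    for name, vol in volumes.items():
        if vol["used"] / vol["capacity"] > ALERT_THRESHOLD:
            print(f"{name} is past {ALERT_THRESHOLD:.0%} utilization; provision soon")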

As previously reported in this column, 2002 saw several announcements by storage management software vendors that they were adopting an “application-centric” focus in their products to enable managers to see what storage resources were associated with each application in their environment. In other words, they were following the trail blazed three years ago by BMC Software, with its application-centric storage management (ACSM) initiative.

Veritas Software, Computer Associates, EMC, and others newly embracing application-centrism as the metaphor for storage management are still lagging behind old BMC in their implementation of the approach. It is not enough merely to relate storage to applications. The next step is to provide tools for manipulating datasets associated with an application to optimize the performance of the application itself. BMC does this for SQL and Oracle databases, for example, by providing additional software products for manipulating the locations of indices and datasets so that read/write contention on specific volumes can be minimized and input/output performance can be improved.
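
To illustrate the principle (and only the principle; what follows is not a rendering of BMC’s actual tools), here is a sketch in Python. Given a hypothetical mapping of database tables and indices to volumes, it flags any index that shares a volume with a table, since concurrent index reads and table writes contend for the same spindles.

    # Sketch of the placement check behind application-centric dataset
    # management. The dataset-to-volume mapping is invented; real tools
    # read it from database catalogs and storage device interfaces.
    datasets = [
        {"name": "orders_table", "kind": "table", "volume": "vol07"},
        {"name": "orders_index", "kind": "index", "volume": "vol07"},
        {"name": "customers", "kind": "table", "volume": "vol08"},
    ]

    # Collect the tables resident on each volume.
    tables_by_volume = {}
    for ds in datasets:
        if ds["kind"] == "table":
            tables_by_volume.setdefault(ds["volume"], []).append(ds["name"])

    # An index sharing a volume with a table is a contention candidate.
    for ds in datasets:
        if ds["kind"] == "index" and ds["volume"] in tables_by_volume:
            print(f"move {ds['name']} off {ds['volume']} "
                  f"(it shares the volume with {', '.join(tables_by_volume[ds['volume']])})")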

While it is likely that the newbies in application-centric management will eventually catch up to BMC’s lead in this area, once they do, they may well discover that the Houston software company has jumped ahead of them once again. Given its recently announced relationship with Invio, BMC is well on its way to upping the ante in the realm of storage management by moving from inherently manual provisioning to process-oriented management. If process management is a new concept to you, think “operator procedures”—standardized playbooks used in the mainframe world to instruct the night operator in the actions that should be taken in response to specific events, messages, or alarms from the system.
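
For readers new to the concept, a small sketch may help. An operator procedure is, at bottom, a mapping from events to an ordered list of response steps. The Python below is purely illustrative; the event names and steps are invented, not drawn from any vendor’s product.

    # Sketch of an operator procedure: standardized, ordered responses
    # keyed to system events. All events and steps here are invented.
    playbook = {
        "VOLUME_NEAR_FULL": [
            "identify the owning application",
            "verify snapshot space is not the culprit",
            "allocate an expansion volume from the free pool",
            "notify the application owner",
        ],
        "BACKUP_JOB_FAILED": [
            "check tape library status",
            "rerun the incremental backup",
            "escalate to the on-call administrator if the rerun fails",
        ],
    }

    def respond(event):
        """Print the standardized procedure for an event, in order."""
        for i, step in enumerate(playbook.get(event, ["no procedure on file"]), 1):
            print(f"{i}. {step}")

    respond("VOLUME_NEAR_FULL")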

Process management is Invio’s cup of tea. Combining its technology with the basic “view your storage by application” technology in BMC’s PATROL storage management framework, augmented with BMC’s steadily improving suite of rich application-specific management tools, paves the way for the centralized certification, management, and control of local or remote storage management processes, including storage migration, topology management, provisioning, data replication management, and backup.

Basically, the system will “learn” the manual processes that are currently performed to provision and manage storage on an application-by-application basis, and create automated versions of those processes for use by operators when the situation requires. This is different from developing management task “wizards” a la Veritas SANPoint Control. Until storage automation is perfected, human intervention will still be required to perform certain tasks. Process management provides the best approach for integrating manual and automated tasks so that you don’t need a degree in rocket science to do the job. Were it not for process management in mainframe shops, every operator would need the skills of a systems programmer.
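
Here is a minimal sketch of that integration, again in Python and again with an invented process (a storage expansion, in this case). Automated steps run on their own; manual steps pause and prompt the operator, so the standardized sequence holds either way.

    # Sketch of a process that interleaves automated and manual steps.
    # The steps themselves are invented for illustration.
    def allocate_volume():
        print("allocated expansion volume from the free pool")

    def rescan_host():
        print("host rescan complete; new volume is visible")

    process = [
        ("automated", allocate_volume),
        ("manual", "cable the new array port to fabric switch 2"),
        ("automated", rescan_host),
        ("manual", "confirm the application sees the expanded filesystem"),
    ]

    def run(steps):
        """Run automated steps; pause for confirmation on manual ones."""
        for kind, step in steps:
            if kind == "automated":
                step()
            else:
                input(f"MANUAL STEP: {step} -- press Enter when done ")

    run(process)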

Process management is the next logical step in storage provisioning. BMC will likely get there first. I say this because most of the other vendors that I am tracking are chasing their proverbial tails trying to come up with a virtualization-based scheme that will enable on-the-fly modifications to storage volumes—their idea of storage provisioning automation. This approach is flawed for a number of reasons and will not bear fruit until we have developed a universal technology for carving and splicing LUNs across heterogeneous, proprietary storage platforms. I, for one, am not holding my breath in expectation of that solution any time soon.

In the realm of data protection, the other Tower of pain in storage management, vendor claims also fall somewhat short of reality. I can’t tell you how many shops I have visited over the past year that do not have a data backup strategy beyond prayer. I’m all for prayer, but I doubt that many of the IT professionals with whom I have spoken believe they will have a prayer of recovering if their servers, which have never been successfully backed up, have an encounter with a smoke-and-rubble disaster.

In fact, 2002 will be remembered by many vendors of backup solutions as the boom year that wasn’t. Despite the lip service paid by organizations to disaster preparedness following the tragedy of 9/11, the much-anticipated surge in sales of data protection solutions never materialized. The market analysts failed to factor in the down economy and the tendency of business people to forget risks and exposures as events receded into the horizon in their mental rear view mirrors. Most business managers can think of a hundred things they would prefer to do with their money than to spend it on a data protection strategy that in the best of all possible circumstances would never need to be used.

Going forward, there are many technologies that may facilitate various aspects of data protection. I will cover one of these, cache-based file backup from Tacit Networks, in a future column. For now, the interim conclusion must be that data protection can no longer be conceived as a “bolt-on” to applications and their supporting IT infrastructure. Data protection requirements need to be considered at the time of application design and development and facilitated in software and platform architecture.
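
What might designed-in data protection look like in practice? One possibility, sketched below in Python with invented field names, is for an application to ship with a declaration of its own recovery requirements, which the platform reads and acts upon, rather than leaving an administrator to bolt something on after deployment.

    # Sketch of an application declaring its own protection needs.
    # Field names and values are invented for illustration.
    protection_manifest = {
        "application": "order_entry",
        "recovery_point_objective_minutes": 15,  # max tolerable data loss
        "recovery_time_objective_minutes": 60,   # max tolerable downtime
        "datasets": ["orders_table", "orders_index"],
    }

    def schedule_protection(manifest):
        """Derive a replication/backup interval from the declared RPO."""
        rpo = manifest["recovery_point_objective_minutes"]
        print(f"{manifest['application']}: replicate or back up every "
              f"{rpo} minutes to meet the declared RPO")

    schedule_protection(protection_manifest)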

More on this later.

About the Author

Jon William Toigo is chairman of The Data Management Institute, the CEO of data management consulting and research firm Toigo Partners International, as well as a contributing editor to Enterprise Systems and its Storage Strategies columnist. Mr. Toigo is the author of 14 books, including Disaster Recovery Planning, 3rd Edition, and The Holy Grail of Network Storage Management, both from Prentice Hall.
