
Storage 2008: Promises Delivered And Postponed

Promises come and go. A few have morphed into something different.

Looking over the past year in storage, I see the usual assortment of promises made by vendors for delivery by year’s end. Some have been delivered, others have been postponed, and a few have morphed into something different altogether. This week’s column is the first of several in which I will examine the evolutionary, and in some cases revolutionary, changes in storage that will be on our minds as the calendar year winds to a close.

Let’s start with storage resource management. An effort to standardize management using the Storage Management Initiative Specification (SMI-S), which the Storage Networking Industry Association (SNIA) proclaimed a huge success last December, has gone nowhere. The approach requires integrating a software component called a “provider” into every storage product; the provider reports status and exposes configuration controls to a management console. With SNIA beginning to fray at the seams, criticism of the approach, once politically incorrect, is now rampant. Even a few of the vendors that helped develop the standard now say it is no longer part of their product road maps.

However, with improved storage task automation and management growing in importance in the minds of IT architects, especially at companies where IT operations staff are being pared back in response to budgetary constraints, the failure of SMI-S has encouraged innovation in other areas. One trend I see is the deconstruction of storage resource management (SRM): rather than shipping as an unwieldy, difficult-to-deploy software suite, SRM components are being integrated into other products such as backup software.

Deconstructing SRM

This week at CA World in Las Vegas, I saw a demonstration of the next generation of ARCserve, the company’s flagship data protection product, which includes a number of functions previously available only in CA’s BrightStor SRM package. CA has been steadily growing the feature set of its tape backup software, which has received little press attention this year, mainly because of the long marketing shadow cast by its cousin in CA’s data protection and disaster recovery product family, CA XOsoft.

XOsoft, the WAN-based data replication and failover software product, has garnered much discussion in the trade publications (including this column) because of its visible role in keeping mission-critical applications running for businesses impacted by hurricanes and other major disasters in 2008. However, ARCserve and other tape backup products are the real workhorses of DR. Cogent estimates place 70 percent of the world’s data on tape backups, and, for most companies, tape is still the most affordable and reliable mechanism for data protection.

CA developers have done an outstanding job of adding value to ARCserve, which today sports its own virtual tape system to redress the problems of tape backup windows. The perennial problem of tape has been the difficulty of completing backup jobs within operational windows, and the gap between actual backup times and the estimates presented by backup software has been a source of ire for many users.

The problem is actually caused by the way backup software works: it aggregates data sets from different repositories into a single superstream of data and forecasts job completion time with a simplistic calculation, dividing the total volume of data by the nominal write rate of the target tape device. As the backups of smaller data sets complete, the superstream begins to unravel, data is no longer fed to the tape drive at an optimal rate, and the drive begins to shoeshine, repositioning tape on the read/write head and delaying the completion of the job.
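To make the arithmetic concrete, here is a minimal sketch of that naive forecast in Python. The data set sizes and drive speed are hypothetical, and real backup products differ in the details:

```python
# Naive backup-window forecast: total data volume divided by the drive's
# nominal write rate. Data set sizes and drive speed are hypothetical.
data_sets_gb = [500, 120, 40, 8]   # per-repository data sets (GB)
nominal_rate_gb_s = 120 / 1024     # drive rated at 120 MB/s, expressed in GB/s

total_gb = sum(data_sets_gb)
estimated_hours = total_gb / nominal_rate_gb_s / 3600
print(f"Forecast: {estimated_hours:.1f} hours for {total_gb} GB")

# The forecast assumes the drive streams at its rated speed for the whole
# job. Once the smaller data sets finish, the aggregate feed rate drops
# below the drive's streaming threshold and the drive "shoeshines"
# (stops and repositions tape), so the real job runs longer.
```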

Introducing a virtual tape system into the mix can help address this problem. ARCserve enables the user to designate a disk for the initial copy of the data, then performs backups from the staged copy, perfecting the superstream, so to speak: disk reads can feed the tape drive at a steady rate, so the drive streams rather than shoeshines. This nifty bit of innovation makes backup window problems yesterday’s news. The innovative thinking doesn’t stop there.
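As a rough illustration of the disk-staging idea (a generic sketch, not CA’s implementation), the code below stages source data sets to disk first and then streams the staged copies to the tape device in one sequential pass; the paths and device name are hypothetical:

```python
import shutil
from pathlib import Path

STAGING_DIR = Path("/backup/staging")   # hypothetical staging disk

def stage(source_files):
    """Phase 1: copy each source data set to the staging disk.
    Slow or bursty sources delay only this phase, not the tape pass."""
    STAGING_DIR.mkdir(parents=True, exist_ok=True)
    for src in source_files:
        shutil.copy2(src, STAGING_DIR / Path(src).name)

def write_to_tape(tape_device="/dev/nst0", block_size=256 * 1024):
    """Phase 2: stream the staged copies to tape sequentially.
    Disk reads sustain the drive's streaming rate, so the drive
    never starves and shoeshines."""
    with open(tape_device, "wb") as tape:
        for staged in sorted(STAGING_DIR.iterdir()):
            with open(staged, "rb") as f:
                while chunk := f.read(block_size):
                    tape.write(chunk)
```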

With SRM a tough sell in most companies, the technical developers at CA have elected to kill two birds with one stone. ARCserve (like most tape backup systems) already provides the capability to inventory servers and storage platforms, functions that SRM also performs. The next evolution of the product performs these tasks at a more granular level, enabling the visualization of storage associations with servers and applications (including virtual machines in a VMware environment) and providing reports that are useful not only for optimizing backups but for optimizing storage as well.

Complementing this value-add is the soon-to-be-released capability to de-duplicate backup data sets that have been written to the ARCserve VTS disk, a capability that may well end up usurping the value proposition of de-duplication appliance vendors.
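De-duplication implementations vary, but the core idea is to store identical chunks of data only once, indexed by a content hash. Here is a minimal, generic sketch using fixed-size chunks (not a description of CA’s implementation; many products use variable-size chunking):

```python
import hashlib

CHUNK_SIZE = 64 * 1024  # fixed-size chunks, chosen arbitrarily for illustration

def deduplicate(path, store):
    """Split a backup file into chunks and store each unique chunk once,
    keyed by its SHA-256 digest. Returns the recipe (list of digests)
    needed to reconstruct the file later."""
    recipe = []
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            digest = hashlib.sha256(chunk).hexdigest()
            store.setdefault(digest, chunk)  # write payload only if unseen
            recipe.append(digest)
    return recipe

# Two nearly identical backup images share most chunks, so the store
# ends up holding far less data than the sum of the inputs.
```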

Deconstructing SRM into functional utilities is not new. The stage was set by the hardware vendors themselves, who have been adding software to their arrays for the past decade to differentiate them from competitors’ gear. Doing this in software rather than hardware avoids the lock-in and expense that go hand-in-hand with hardware plays.

SaaS

Software as a service (SaaS) has also been a much-hyped trend in the tech sector this year. With regard to storage, it has usually equated to on-line backup, a technique in which data is backed up to a remote target environment via the Internet.

Results of this strategy have been mixed. The initial backup is typically painfully slow and bandwidth intensive. Subsequent backups usually include only changed data and are more transparent in terms of system performance and bandwidth utilization.
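The change-only pass is typically driven by comparing the current state of each file against a catalog from the previous run. Below is a minimal sketch assuming a simple whole-file hash catalog (file names are hypothetical; real services usually track changed blocks rather than whole files):

```python
import hashlib
import json
from pathlib import Path

CATALOG = Path("catalog.json")  # hypothetical: digests from the previous run

def file_digest(path):
    """SHA-256 of a file's contents, read in 1 MiB pieces."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(1 << 20):
            h.update(chunk)
    return h.hexdigest()

def changed_files(root):
    """Return only files whose contents differ from the last backup,
    then save the new catalog for the next run."""
    old = json.loads(CATALOG.read_text()) if CATALOG.exists() else {}
    new, to_send = {}, []
    for p in Path(root).rglob("*"):
        if p.is_file():
            d = file_digest(p)
            new[str(p)] = d
            if old.get(str(p)) != d:
                to_send.append(p)
    CATALOG.write_text(json.dumps(new))
    return to_send
```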

However, a vulnerability of the approach was revealed by the hurricanes of the past two years: expedited restore of backed-up data can become problematic when many users in the same geographic area request the service at the same time. Although there were no reports of delays as lengthy as those some users confronted following Hurricane Katrina (delays of up to 100 days were reported after that storm), there were problems with some on-line backup services.

Still, this hasn’t dimmed enthusiasm for the SaaS model. A big announcement at CA World was the creation of a service based on CA XOsoft: customers who prefer not to deploy their own site-replication and failover strategy can work with a CA service provider to identify always-on applications and their data, and to deploy the necessary agents to replicate data and fail over to a data center host as needed.

Starting at about $400 per month, this service has wheels as an expedient option for continuity of mission-critical applications, and it builds on a thus-far stellar report card for the XOsoft product. In discussions with CA executives at the show, I suggested that they make the service available as a free trial to companies during the 2009 hurricane season. I am reasonably certain that companies using the service for any length of time will sign on for an ongoing relationship.

One minor complaint voiced by a CA exec: an attendee at the show told him that he had used a 30-day free trial of the XOsoft product to facilitate an application migration into a VMware environment, using failover as the migration method! It was unclear whether the CA manager was more put off by the attendee’s failure to license the product after his migration project or by CA’s failure to anticipate this use for a product aimed primarily at disaster recovery and business continuity.

More promises made and postponed in the next installment. Your feedback is welcome: jtoigo@toigopartners.com.
