
Why Data Protection Budgets Are in Trouble

Data protection budgets are in trouble because executives are getting the wrong pitch from storage pros.

Last week’s column looked at survey findings from Sepaton that painted a rosy picture of the state of disaster recovery planning, data protection, and (unsurprisingly, given Sepaton’s expertise) de-duplication. Interest in and spending on these efforts and products were projected to be largely consistent with 2008: that is, steady or increasing.

Responses to that column, though conflicting, underscore the uneven state of corporate disaster preparedness. On the one hand, a good friend who has served for many years as continuity planner for a prominent retail office supply company reported that he and his small staff had received pink slips from the front office. He had hoped his group would stay under the cost-cutting radar by keeping its budget requests small and endeavoring to “build in” continuity, making it an important consideration in the development of new business processes and a key criterion for tech purchases initiated by the company.

Apparently, this business-savvy approach was not enough. Curbing expenditures was imperative and every little bit helped -- especially when the expense represented, to senior management, an investment in a capability that in the best of circumstances never needs to be used. Disaster recovery is often perceived this way.

The continuity planner’s sad report was reinforced last week by other trade press accounts of cost cutting that impacts data protection spending. At least two weekly publications suggested that data protection and disaster recovery were high on the list for corporate budget cuts in 2009, implying that such endeavors lack a compelling business value case -- especially to companies that don’t have operations in hurricane-prone areas of the country or recent memories of cataclysmic natural or manmade disasters.

In contrast, late last week I had the pleasure of speaking before a group of resellers and integrators who reported that business was going gangbusters. They indicated that purchases of data storage and data protection solutions were exceeding all sales goals for the quarter and that February receipts would more than make up for the usual holiday-season slowdown.

Of course, these people were talking about integrated solutions mixing data storage, data archiving, and data protection components. They were all in sync with the message I was pitching on stage: the need to rethink product-oriented selling and to work on business value-focused solutions that contain costs, reduce risk, and provide top-line growth to customers.

Disaster recovery, focused only on risk reduction, doesn’t make the same kind of business value case as a more comprehensive data management pitch (of which data protection is a part). A data management initiative that offers a corporate customer a combination of technology and strategy -- one emphasizing cost containment, compliance, continuity (often a part of compliance), and a smaller carbon footprint (usually conceived as part of cost containment) -- has a much better chance of meeting with front-office approval than a simplistic DR-only solution.

Chatting with this group provided a glimmer of hope and renewed enthusiasm about the future of data protection -- not only because the event took place in sunny Clearwater, FL, but because it was fashioned around the sales and marketing efforts of Xiotech, whose deconstructionist storage mentality and standards-based management approach stand in stark contrast to those of just about every other storage player in the industry today.

Where the EMCs, NetApps, IBMs, and HPs are adding more “value-add” functionality to a storage stovepipe array -- often impeding its ability to be managed in concert with competing products from other vendors and almost always driving up acquisition and licensing costs to obscene levels -- Xiotech is developing an ecosystem of third-party vendors whose products can interoperate with its Emprise 5000 storage building block (itself a highly resilient array) and participate in a common management paradigm based on open standards from the W3C.

Overall, my message to the audience was that the solution integrator needed to be put back on the pedestal, combining best-of-breed products and functionality to fit storage architecture to customer needs. From the standpoint of data protection, this involved selecting best-of-breed software and hardware components that could deliver services efficiently to the data requiring them. In contrast to the embedded-systems mentality that dominates most of the array-maker world, Xiotech was encouraging solution providers to adopt a more network-oriented mindset, placing functionality where it best serves the customer, not the vendor.

Some functions, such as de-duplication engines or virtual tape libraries (VTLs), don’t need to be delivered with their own cabinets of overpriced disk drives. They can instead run in the network -- on appliances, routers, or servers -- against drives that are already deployed. Many de-dupe vendors claim they cobble together the tin and spinning rust with their software because that is what their consumers prefer: a pre-integrated box. That may be true in smaller shops with limited labor assets, but enterprise consumers should know better.
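
To see why, consider that a de-duplication engine is, at bottom, just software: it fingerprints incoming blocks and stores each unique block only once. Here is a minimal, illustrative sketch in Python -- hash-based with fixed-size blocks, every name and size hypothetical rather than any vendor’s actual design -- and note that nothing in it cares whether it runs on an appliance, a router blade, or a commodity server:

```python
import hashlib

BLOCK_SIZE = 4096  # hypothetical fixed block size; real engines often use variable-length chunking

class DedupStore:
    """Minimal content-addressed store: each unique block is kept exactly once."""

    def __init__(self):
        self.blocks = {}   # SHA-256 digest -> block bytes (stands in for any existing disk pool)
        self.recipes = {}  # object name -> ordered list of digests needed to rebuild it

    def write(self, name, data):
        digests = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            # Store the block only if its contents have never been seen before
            self.blocks.setdefault(digest, block)
            digests.append(digest)
        self.recipes[name] = digests

    def read(self, name):
        # Rebuild the original object from its recipe of block digests
        return b"".join(self.blocks[d] for d in self.recipes[name])

store = DedupStore()
store.write("backup-mon", b"A" * 8192)
store.write("backup-tue", b"A" * 8192)  # identical data: no new blocks are stored
print(len(store.blocks))                # 1 unique block held for 16 KB of writes
```

The logic never touches vendor-specific hardware, which is precisely why the function can live in the network rather than in a cabinet of drives the customer didn’t need to buy.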

They should know, for example, that placing de-duplication on a gateway in the network provides a port-multiplier benefit that drives down cost. More data paths can be routed through a gateway than through a dedicated stovepipe, which in turn means you don’t need as many stovepipe platforms.
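
To illustrate the arithmetic with invented numbers (no vendor’s actual port counts are implied): if a dedicated appliance serves only the handful of ports on its own chassis, while an in-network gateway can service every path routed through it, the box count falls quickly.

```python
import math

# All figures below are hypothetical, chosen only to show the shape of the math.
paths_needed = 32     # data paths that require de-duplication services
appliance_ports = 4   # ports on each dedicated (stovepipe) de-dupe appliance
gateway_paths = 16    # paths a single in-network gateway can service

appliances = math.ceil(paths_needed / appliance_ports)  # 8 boxes
gateways = math.ceil(paths_needed / gateway_paths)      # 2 boxes

print(f"{appliances} stovepipe appliances vs. {gateways} gateways")
```

Fewer platforms means fewer acquisitions, fewer licenses, and fewer cabinets to power and cool -- the cost-containment story the front office actually wants to hear.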

Placing functionality where it belongs often conflicts with vendors’ desire to appear to deliver more value every quarter by adding features and functions to the array controller. Having set consumer expectations of continuous “value” enhancement, they must now appear to deliver more -- usually in the form of more embedded software functionality -- with each generation of their array products.

One obvious outcome is that more data is being placed at risk. I will explain why -- and what you should do about it -- next week.

Your comments are welcome: jtoigo@toigopartners.com.
