In-Depth
Productizing Storage Services the IBM Way
How do you help businesses better manage information and increase the performance of their database and storage-related infrastructures?
It seems that just about every Big-Iron storage vendor is trying to get into the services business. There are several reasons for this move, of course. For one, a publicly traded services company can sell stock at 30 times earnings without Wall Street analysts batting an eyelid, while a storage hardware company is likely to get flamed for floating paper amounting to more than twice earnings.
In addition, with less than 5 percent annual growth in IT spending among traditional customers of Big Iron, hardware sales have been relatively flat in recent years. Services are where money is being spent by companies seeking to get the most bang from the bucks already spent on hardware or by those trying to forestall hiring more permanent staff for pressing IT projects.
For these reasons, and many others, services are a big deal. When EMC announced an earnings shortfall of 33 percent and layoffs of 4 percent of its staff last week, it sweetened the news by reasserting its goal to increase earnings from software and services. At Sun Forum last week in Las Vegas, services got the spotlight both on stage and off.
It is IBM, however, that currently leads the mass of hardware vendors heading down the services path. IBM has divested some of its hardware assets in recent years and secured through partnerships the equipment offerings it preferred not to build itself. Last year, Big Blue partnered with Network Appliance to offer its Filers rather than building its own NAS boxes for enterprises. Instead of diversifying its hardware product lines, IBM has been laser-focused on building its services business, going so far as to change its 100-plus-year self-description with Dun & Bradstreet from hardware manufacturer to software and services company.
Service is a challenging business, of course, especially when the size and complexity of an organization grows. This, according to Paul Fried, vice president of Storage and Data Services at IBM, explains the announcement the company is making today.
On its face, the announcement comes as no surprise. Quoting from the release, IBM is initiating a set of services and software “to help businesses better manage information and increase the performance of their database and storage-related infrastructures. Designed by information management and storage infrastructure experts, the new service products enable businesses to tackle information overload and surging data growth by smartly mining existing data, reducing data management complexity and better aligning storage infrastructures with the value of business information. The service products will be available from IBM Global Technology Services and represent the latest step in IBM's strategy to deliver traditional labor-based technology services in a manner similar to the delivery of technology products.”
Fried says that the newly announced productized service offerings (and there is a rather lengthy list of them) are part of a transformation he set out to accomplish at the beginning of the year. “In the past, we centered on building capabilities within a particular geography. It was difficult to take the great work that was being done locally and make it pervasive within the company. We want to build a global capability with consistent messages and offerings using consistent tools and where possible to build best practices and processes directly into software tools.”
Speaking frankly, he told me that the lack of a uniform methodology, “implemented consistently in all engagements,” had the impact of “increasing engagement costs and decreasing productivity. It also increases risk if you do not use the same processes consistently.” In a nutshell, having each designer in each engagement “reinvent the wheel” carried with it a cost and risk profile that Fried sought to adjust to the benefit of IBM and its customers. Big Blue needed a mechanism for creating uniformity in its storage services, and today’s announcement helps get it there.
Having learned the mainframe trade at the knee of IBM in my former life, I wondered why the huge knowledge base amassed by IBM, both in its customer work and in its own labs, and issued periodically in the form of Redbooks, was not sufficient to steer the work of project teams. There have been many occasions when I have consulted these methodologies and this encyclopedic information repository in my own work as an IT practitioner and consultant.
“The Redbooks and other information created by the International Technical Support Organization (ITSO) within IBM are still very popular and widely read,” Fried said, “but within the global technology services organization, the ITSO process hasn’t been leveraged as well as it might have been. We are developing a new process for ITSO that will get information to service representatives faster and get them trained. It is a simple issue of taking what exists and scaling it. Imagine that you have a group of folks that use a process, and you suddenly quadruple the demand for the process. You need to do things to scale the process.”
He noted that IBM also offers internal forums called Communities of Practice. About 1,000 IBM staff, including service providers, routinely use the Storage Community of Practice to ask questions of peers and obtain guidance on common technical issues. This capability also needs to scale, according to Fried.
For now, the first steps involve building best practices, like so much business logic, directly into the software tools used by consultants to solve customer problems, Fried says. He notes that emphasis is being placed on tools and software to help automate the planning phase of a project. “IBM has focused a lot in the past on the assessment phase and the strategy and design phases of a project. We are working to automate the way that implementation plans are created and implemented.”
He claims that the tools are designed to support “real world” solutioneering (that is, heterogeneous solutions that may include products not supplied by IBM). “Of course, we want to sell IBM stuff, but that’s not the real world. The first line on the first chart of my approach is to support heterogeneous infrastructure. Our tools do not simply select an IBM product for every problem.”
Will IBM make the tools available to customers, as it has Redbooks and other materials in the past? He seemed to hedge in his response: “Our project accelerators are, to some extent, our intellectual property and our competitive advantage. What I do have right now is a vision of changing how we work with business partners and channels. We are thinking about setting up a Business Partner Advisory Council to promote not only infrastructure design but also information management. I envision this as a collaborative experience in which we can share some of our product offerings and methods and the partners can share theirs with us.”
One key to whether this service productization effort will ultimately bear fruit for IBM will be its flexibility in the face of different client needs, driven largely by the milieu in which the client operates. All banks are not alike. Moreover, the realities confronting a bank office in Kuala Lumpur are likely to be quite different from those that confront a sister office in Stuttgart or Boston, from access to WANs to available pools of qualified personnel. It has been demonstrated over and over again that “cookie cutter” approaches to problem solving do not work.
Fried conceded this point: “We are endeavoring to build an issue-based consulting and services business. This is different from product-based consulting, which is what you find in the industry generally. We want to take services to a new level and productize specific offerings around specific issues so that the service outcomes are more predictable.”
We will be watching as the services roll out. Your comments are welcome: jtoigo@toigopartners.com.