
Beyond Utility Computing: Gartner's Vision for Data Center Integration

Automation is the real key to real-time infrastructure, and today’s infrastructure is anything but real time. The solution: what Gartner analysts call business process fusion.

Surveying an enterprise landscape in which too few systems are flexible enough to adapt to changes in real time, Gartner analysts pronounced the status quo wanting at their Symposium/ITxpo 2003 conference last month in Orlando, Florida. The solution? A technology vision that centers on a dynamic, two-way relationship between IT and business processes, which Gartner calls business process fusion.

The problem is especially acute in the data center, where, Gartner analysts say, most systems are not only underutilized but also lack the flexibility to respond to changing business requirements in anything approaching real time.

“Automation is the real key to real-time infrastructure, and today’s infrastructure is anything but real time,” says Donna Scott, a vice president and distinguished analyst with Gartner. “Today, if you want to … take an application server from a trading application and repurpose it to a financial reporting application, it would often take more than eight hours to configure and deploy those changes.”

That’s simply not going to cut it, Scott says. “With a real-time infrastructure, changes occur dynamically and automatically across the components, so rather than [requiring] eight hours, it occurs in minutes,” she comments.
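
Scott’s example amounts to an argument for encoding the manual runbook as something software can execute. Purely as illustration, and with entirely invented step names (drain_connections, apply_config, and so on), here is a minimal Python sketch of the idea: the repurposing procedure becomes a sequence of automatable steps, so a change that once took a shift of manual work becomes one call.

```python
# Hypothetical sketch: a repurposing runbook executed by a tiny orchestrator.
# None of these step names come from a real product; they only illustrate
# encoding a manual reconfiguration procedure as automatable steps.

def drain_connections(server):
    print(f"{server}: draining client connections from trading app")

def wipe_app_tier(server):
    print(f"{server}: removing trading-application binaries and config")

def apply_config(server, app):
    print(f"{server}: deploying {app} image and configuration")

def register_with_service(server, app):
    print(f"{server}: registering with {app} load balancer")

TEARDOWN = [drain_connections, wipe_app_tier]  # manual steps, now codified

def repurpose(server, new_app):
    """Run every teardown step, then provision the new application."""
    for step in TEARDOWN:
        step(server)
    apply_config(server, new_app)
    register_with_service(server, new_app)

repurpose("appserver-07", "financial-reporting")
```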

Hewlett-Packard Co., IBM Corp., Sun Microsystems Inc., and other vendors have championed competing utility computing visions as strategies to increase the utilization of systems, improve efficiencies, and automate business processes. Utility computing describes an infrastructure in which compute resources — servers, storage, and even networks — are virtualized and can be apportioned according to capacity requirements or changing business needs.
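
One toy way to picture “apportioned according to capacity requirements”: divide a fixed pool of capacity in proportion to each workload’s demand. This is only a sketch of the general idea, not any vendor’s algorithm; the workload names and figures are invented.

```python
# Toy apportioning: split a fixed pool of compute capacity in proportion
# to each workload's stated demand. Names and numbers are invented.

POOL_CPUS = 64

demand = {"trading": 48, "reporting": 20, "email": 12}  # requested CPUs

def apportion(pool, demand):
    total = sum(demand.values())
    # Scale every request by the same factor: requests are met exactly
    # when the pool suffices, and shrink proportionally when it does not.
    factor = min(1.0, pool / total) if total else 0.0
    return {name: round(cpus * factor, 1) for name, cpus in demand.items()}

print(apportion(POOL_CPUS, demand))
# {'trading': 38.4, 'reporting': 16.0, 'email': 9.6}
```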

Although proponents have typically emphasized the cost savings of the utility computing paradigm, in which customers essentially pay only for the capacity they use, Gartner’s Scott says that the real-time infrastructure she envisions is something bigger. “[The] real-time infrastructure is not just about saving money. It’s about making your infrastructure more agile, more responsive to changes in business priorities, and business demands, and also to enable IT resources to dynamically meet a service level.”

Nevertheless, acknowledges Gartner vice president and research director Tom Bittman, utility computing is a step in the right direction. “[A real-time infrastructure] starts with making the infrastructure virtualized,” he confirms. “Virtualization connects and pools IT resources so they can be managed at a higher level logically. For example, logically moving resources, or parts of resources, from one service to another without physically moving equipment. So virtualization really is the foundation of automation.”
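
Bittman’s logical move, entitlements shifting while the equipment stays put, can be pictured as bookkeeping over a shared pool. A minimal sketch, with invented service names and capacities:

```python
# Minimal sketch of virtualization as bookkeeping: services own shares of a
# pooled capacity, and "moving" resources just transfers entitlement.
# Names and numbers are invented for illustration.

class ResourcePool:
    def __init__(self, total_units):
        self.total = total_units
        self.shares = {}          # service name -> units currently assigned

    def free(self):
        return self.total - sum(self.shares.values())

    def assign(self, service, units):
        if units > self.free():
            raise ValueError("pool exhausted")
        self.shares[service] = self.shares.get(service, 0) + units

    def move(self, src, dst, units):
        """Logical move: no equipment changes hands, only entitlements."""
        if self.shares.get(src, 0) < units:
            raise ValueError(f"{src} holds fewer than {units} units")
        self.shares[src] -= units
        self.shares[dst] = self.shares.get(dst, 0) + units

pool = ResourcePool(total_units=100)
pool.assign("trading", 60)
pool.assign("reporting", 20)
pool.move("trading", "reporting", 25)   # reporting spikes at quarter-end
print(pool.shares)                      # {'trading': 35, 'reporting': 45}
```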

Ironically, Gartner researchers say, a real-time infrastructure describes a configuration remarkably similar to the early days of the mainframe, when a single system often supported all of an enterprise’s workloads. The simplicity of the mainframe approach eventually gave way to a model in which IT organizations deployed new systems whenever they needed to introduce new applications. Gartner’s Scott says that this had two effects. “First, the majority of applications are small, and require less capacity than the servers they sit on,” she points out. “And second, systems have to be sized for peak loads, plus any variability, plus anticipated growth. So the servers are bigger than they really need to be. That’s hugely inefficient in terms of resource usage.”
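
Scott’s sizing formula, peak load plus variability plus anticipated growth, makes the inefficiency easy to quantify. A back-of-the-envelope sketch with invented figures:

```python
# Back-of-the-envelope: why one-app-per-server runs underutilized.
# All figures are invented to illustrate Scott's sizing formula.

average_load = 20        # units of capacity the app typically needs
peak_load = 50           # worst-observed load
variability = 0.20       # headroom for swings beyond the observed peak
growth = 0.30            # anticipated growth over the server's life

purchased = peak_load * (1 + variability) * (1 + growth)
utilization = average_load / purchased

print(f"capacity purchased: {purchased:.0f} units")   # 78 units
print(f"typical utilization: {utilization:.0%}")      # roughly 26%
```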

Until recently, enterprises relied on server consolidation strategies to reverse the effects of this practice. Although some skeptics have questioned the necessity of server consolidation or utility computing strategies at a time when hardware prices are still in freefall, Scott suggests that they’re not seeing the whole picture. “Maybe it would be true if data center costs were just a matter of hardware costs, but a large part of data center costs today is made up of labor,” she observes. “Computing platforms only 25 percent utilized still require 100 percent administration.”
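
Her arithmetic is easy to reproduce. Assuming, purely for illustration, that administration labor scales with the number of boxes rather than with the useful work they do, consolidation cuts the labor cost per unit of real work sharply:

```python
# Illustration of "25 percent utilized still requires 100 percent
# administration": admin labor scales with box count, not useful work.
# All figures are invented.

ADMIN_COST_PER_SERVER = 10_000   # annual labor per managed server ($)

def admin_cost_per_useful_unit(servers, utilization):
    useful_capacity = servers * utilization  # server-equivalents of work
    return servers * ADMIN_COST_PER_SERVER / useful_capacity

before = admin_cost_per_useful_unit(servers=100, utilization=0.25)
after = admin_cost_per_useful_unit(servers=40, utilization=0.70)

print(f"before consolidation: ${before:,.0f} per utilized server-equivalent")
print(f"after consolidation:  ${after:,.0f} per utilized server-equivalent")
# before: $40,000 -- after: $14,286
```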

Elsewhere, Gartner analysts identified three technology triggers that they say are required to achieve business process fusion: systems integration, information unification, and application usability.

Daryl Plummer, group vice president and research general manager for software infrastructure with Gartner, says that the first component, systems integration, isn’t just about connecting systems. “We’re talking integration with a broad scope, the scope to encompass both the full end-to-end processes and also to link between different types of processes,” he explains. “Most system integration as we know it to date has really focused on transactional processes and data. For fusion, that needs to be extended to other types of systems,” for example, Plummer says, systems that handle documents, calendars, and e-mail.

Similarly, information unification isn’t just about aggregating or otherwise pooling information. “[It] requires not just that we link the systems, but also that we provide ways to begin to link the different types of information,” he comments, noting that portals have been “very helpful” to this end: “The portal has provided a framework to view a wide variety of types of information all together.”
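
Plummer’s two points, linking non-transactional systems and presenting many information types in one view, roughly describe an adapter-plus-portal pattern. A hypothetical sketch, with invented sources and items: each source normalizes its contents to a common shape, and the “portal” simply merges them into one chronological view.

```python
# Hypothetical adapter-plus-portal sketch: heterogeneous sources (documents,
# calendar, e-mail) normalized to a common shape and merged into one view.

from datetime import datetime

class DocumentStore:
    def items(self):
        yield {"kind": "document", "title": "Q3 budget draft",
               "when": datetime(2003, 10, 28, 9, 0)}

class Calendar:
    def items(self):
        yield {"kind": "meeting", "title": "Budget review",
               "when": datetime(2003, 10, 29, 14, 0)}

class Mailbox:
    def items(self):
        yield {"kind": "e-mail", "title": "Re: budget numbers",
               "when": datetime(2003, 10, 28, 16, 45)}

def portal_view(sources):
    """Merge every source's items into one chronological view."""
    merged = [item for source in sources for item in source.items()]
    return sorted(merged, key=lambda item: item["when"])

for item in portal_view([DocumentStore(), Calendar(), Mailbox()]):
    print(f'{item["when"]:%b %d %H:%M}  [{item["kind"]}] {item["title"]}')
```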

Then there’s application usability, the most radical component of what Gartner calls business process fusion. “That just means applications that are easy to change. In this case, we’re looking specifically at changes in software related to changes in business processes,” says Gartner research director Mark Raskino. “Applications have always, of course, been linked to processes, but that linkage has been written directly into program code.”

In the application usability model, says Scott, application ownership shifts from enterprise developers or IT managers to the person who owns the relevant business process. “We’re looking for the ability to make process definitions explicit, and be able to modify them directly. Now it’s the business process owner that gets control,” she comments.
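
The shift Raskino and Scott describe, from process logic “written directly into program code” to explicit definitions the process owner can edit, is the core idea behind workflow engines. A minimal sketch with invented step names: the process is data, and a generic engine runs whatever the definition currently says.

```python
# Minimal sketch: the process is data, not code. A business process owner
# edits the definition; a generic engine executes it. Step names invented.

HANDLERS = {
    "check_credit":    lambda order: print(f"credit check for {order}"),
    "reserve_stock":   lambda order: print(f"stock reserved for {order}"),
    "manager_signoff": lambda order: print(f"manager sign-off on {order}"),
    "ship":            lambda order: print(f"shipping {order}"),
}

# Explicit process definition -- the part the process owner controls.
order_process = ["check_credit", "reserve_stock", "ship"]

def run(process, order):
    for step in process:
        HANDLERS[step](order)

run(order_process, "order-1041")

# Policy change: orders now need manager sign-off before shipping.
# No program code changes; the owner edits the definition directly.
order_process.insert(2, "manager_signoff")
run(order_process, "order-1042")
```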

The growing popularity of service-oriented architectures (SOA) provides one example of this, and Scott cautions that such architectures require a new development paradigm, which Gartner calls service-oriented development of applications — or SODA. “[SODA] is the practical impact of these ideas as far as the development team is concerned,” she explains. “It focuses people a great deal on becoming process-centric.”
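
SODA’s process-centric emphasis can be pictured as composing coarse-grained services rather than writing monolithic logic. A hypothetical sketch, with invented service names: each capability sits behind a uniform interface, and the application is little more than the order in which the process calls them.

```python
# Hypothetical SOA sketch: coarse-grained services behind uniform
# interfaces; the "application" is just the process that composes them.

class CreditService:
    def call(self, request):
        return {"approved": request["amount"] < 10_000}

class BillingService:
    def call(self, request):
        return {"invoice": f'INV-{request["customer"]}'}

def purchase_process(request, credit, billing):
    """Process-centric composition: orchestrate services, hold no logic."""
    if not credit.call(request)["approved"]:
        return {"status": "rejected"}
    return {"status": "billed", **billing.call(request)}

result = purchase_process(
    {"customer": "ACME", "amount": 4_200},
    credit=CreditService(),
    billing=BillingService(),
)
print(result)  # {'status': 'billed', 'invoice': 'INV-ACME'}
```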

About the Author

Stephen Swoyer is a Nashville, TN-based freelance journalist who writes about technology.
