Q&A: Getting Virtualization Right

How can IT determine what should be virtualized, then know when the right mix has been achieved?

Virtualization can be tricky. IT has to determine what needs to be virtualized, then know when it has achieved the optimal benefits from the technology. Along the way, IT makes many mistakes, which we examine here, along with what a proper "virtualization budget" looks like. To learn more about getting virtualization right, we spoke with Stephen Elliot, vice president of strategy for CA's Infrastructure Management and Data Center Automation business unit. Elliot focuses on key areas such as business unit technology strategy, analyst relations, market positioning, partner development, and customer deals.

Enterprise Systems: How can IT determine what needs to be virtualized for best performance? What are the key considerations in this process?

Stephen Elliot: There are a few key considerations. First, not every application should be virtualized at this point, because resource requirements vary by application type. The most mission-critical applications, which require dedicated resources and high I/O throughput, should remain on dedicated hardware, at least for now. However, it is likely that upwards of 75 percent of current workloads can be virtualized across mainframe and distributed environments. As vendors continue to reduce the performance inhibitors of virtualized applications, the discussion of which applications to virtualize will broaden. Let's not forget that mainframe virtualization has been around for more than 20 years, and that platform continues to grow. In fact, many customers are looking at their mission-critical applications and considering whether they should be moved to "virtualized mainframes." The bottom line is that IT must weigh its performance requirements, management needs, business and technology risks, capacity planning expertise, and process standardization to determine which applications are the best candidates for virtualization.

Today, the most common applications virtualized beyond file/print are Exchange, departmental SQL, and Oracle databases. Applications such as SAP are on the rise, but there is much hesitancy from IT at the moment.

Among the many pieces that have to be considered (CPU, memory, disk space, etc.), are they all equal? Is there a hierarchy -- something you should consider first because it gives you the most bang for the buck?

Virtualization drives up the need for storage. As such, IT executives must consider the type and architecture of their storage environment. This impacts server, storage, and desktop virtualization strategies. This is one of the most important topics to consider. Another is the type of application you want to virtualize, as well as the amount of risk the IT organization is willing to take. The opportunity to utilize automation has never been greater; IT must consider this as an efficiency strategy that impacts business outcomes.

How do you know when you have it "right"? In other words, how do you know when you have the right balance and that you've achieved the optimal benefits from virtualization? Is it simply a matter of trial and error, or are there best practices IT can follow?

Best practices are forming quickly. One example is the new ratio of VMs to administrators or VMs to hosts. Typically we have seen the VM-to-admin ratio in the range of 100 to 1. The VM-to-host ratio averages about 10 to 1, but depends on the resource allocation and the types of applications being virtualized. Another consideration is ITIL. We are seeing a drive to adopt ITIL v2 processes across the virtualized infrastructure, notably the problem, change, and configuration management processes. Finally, the integration of physical and virtual infrastructure is a requirement; management integration and a consolidated view make a huge difference in cost-reduction opportunities.
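
To make those rule-of-thumb ratios concrete, here is a minimal Python sketch that turns a planned VM count into rough host and administrator estimates. The 10-to-1 and 100-to-1 defaults come from the figures Elliot cites above; everything else (the function name and the 450-VM example) is purely illustrative, and real ratios depend on resource allocation and application mix.

```python
# Rough sizing arithmetic based on the rule-of-thumb ratios cited above:
# about 100 VMs per admin and about 10 VMs per host. These defaults are
# illustrative assumptions, not fixed targets.
import math

def estimate_footprint(total_vms, vms_per_host=10, vms_per_admin=100):
    """Return approximate (hosts, admins) needed for a given VM count."""
    hosts = math.ceil(total_vms / vms_per_host)
    admins = math.ceil(total_vms / vms_per_admin)
    return hosts, admins

if __name__ == "__main__":
    hosts, admins = estimate_footprint(450)
    print(f"450 VMs -> ~{hosts} hosts, ~{admins} admins")
    # 450 VMs -> ~45 hosts, ~5 admins
```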

What does a "good" virtualization plan look like?

It depends on which kind of virtualization we're talking about; there are many flavors, such as storage virt, server virt, desktop virt, and so on. I think there are key tenets: capacity planning; recognizing that virtualization is not the best technology for hosting every application; a management budget and processes; staff commitment; and assurance that the ROI is met beyond hardware consolidation and power-and-cooling numbers. The key is that automation and the integration of physical and virtual environments must be executed. Management must be considered up front, especially as it relates to existing management solutions and process investments.

What common mistakes does IT make when undertaking a virtualization project?

The typical mistakes that IT organizations make are the following:

  • Lack of planning: Many IT organizations do not plan for the impact on the existing staff and how they will adjust to the new technology and the agility that it offers
  • Capacity planning: The inability of IT to really think through how much capacity and what resources are required to deploy virtualization
  • Cross silo adoption: Some IT organizations do not consider who should "own" virtualization. Should new titles emerge? Is there a central cross silo team?
  • Management: The need to manage VMs does not go away; most management challenges in the physical world are exacerbated because they are now virtual and more difficult to view, discover, and manage
  • Budgets: Many IT organizations don't budget enough money for the best management processes and solutions

What mistakes does IT make in maximizing the utilization within a virtual environment?

Again, a recurring theme is poor planning. Weak capacity planning, inadequate budgeting for platform tools versus heterogeneous management solutions, and insufficient product training to maximize ROI are all "mistakes" that can be avoided with good planning. Another is the idea that management gets easier. In fact, it often gets much harder as the VM deployment expands. This creates a huge risk for IT and the business.

Once you've virtualized your environment, what's the next step to getting the most out of the technology?

Proper management and automation. Organizations should plan to automate tasks across physical and virtual environments. Also, one thing you should never forget -- make sure the CIO buys into the decisions you're making. Perhaps I'm stating the obvious, but it's critical just the same.

How does a server virtualization solution help manage the environment? What are its key components?

Server virtualization makes the infrastructure more dynamic and real time. It can drive automated tasks and more efficient operations, but without management it does not matter whether the infrastructure is physical or virtual: the same challenges exist for both platforms, and there really is no difference in the need to manage people, process, and technology. Virtualization offers the opportunity to dynamically allocate compute resources in real time based on business demand, but to execute this, IT needs an automation platform to pull it all together.
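
As a loose illustration of that idea -- not any vendor's API -- the sketch below shows the kind of reconciliation loop an automation platform might run: translate a demand signal into a target VM count, then provision or decommission to close the gap. The capacity figure, function names, and stub callbacks are all hypothetical.

```python
# Toy sketch of demand-driven VM allocation. A real automation platform
# would call into its hypervisor and orchestration layers instead of the
# stub callbacks used here; all names and numbers are illustrative.
import math

TARGET_REQUESTS_PER_VM = 200  # assumed capacity of one VM (illustrative)

def desired_vm_count(current_demand):
    """Translate a demand signal (requests/sec) into a target VM count."""
    return max(1, math.ceil(current_demand / TARGET_REQUESTS_PER_VM))

def reconcile(running_vms, current_demand, provision, decommission):
    """Provision or decommission VMs until supply matches demand."""
    target = desired_vm_count(current_demand)
    if target > running_vms:
        provision(target - running_vms)
    elif target < running_vms:
        decommission(running_vms - target)
    return target

if __name__ == "__main__":
    log = []  # stand-in for calls to a real automation platform
    target = reconcile(
        running_vms=3,
        current_demand=1100,
        provision=lambda n: log.append(f"provision {n} VMs"),
        decommission=lambda n: log.append(f"decommission {n} VMs"),
    )
    print(target, log)  # 6 ['provision 3 VMs']
```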

How is managing a virtualized single-platform environment different from managing a heterogeneous (multi-platform) environment?

In a nutshell, it is about managing the end-to-end service rather than a point-in-time integration. It's safe to say that heterogeneous VM deployments are not a question of if but when. Platform management tools are important, but they don't deliver the end-to-end service management that integrated tools can. As such, it's important to understand what you have and which vendors offer VM support across VMware, Microsoft, Citrix, and others.

How can organizations budget properly? What does a proper "virtualization budget" look like?

There is no easy answer to this. I would say start with how much you think you can save, then translate that into business value: what's the business outcome? It goes beyond cost savings to include the business-process impact, higher-margin opportunities, faster time to market, improved demand management, more effective staff usage, and so on. Many customers have hit a tipping point and are now concerned that the initial business case is at risk of blowing up as costs continue to grow while management processes and solutions are not deployed. It starts with the business case for the technology and isolating how to manage and deploy it. The net of this is that it's not just about virtualization being another architecture that must be managed. It's about the business service and how IT can more effectively deliver the services that generate revenue, meet customer demands, and drive growth.
