Everyone knows that server virtualisation reduces hardware clutter in the datacenter, consolidates workloads, brings disaster-recovery flexibility, slashes costs, and basically saves the planet from nasty carbon emissions. But here's the dirty little secret: many pitfalls await server virtualisation adopters, and a stumble can ruin all your virtual dreams.
The sheer number of potential missteps has Doug Dineley, executive editor of the InfoWorld Test Center, shaking his head. "Virtualisation offers irresistible benefits, and also the opportunity to drown."
It can be shocking to suddenly realise that your IT staff is woefully unprepared for virtualisation and needs training. Or maybe you'll stumble out of the gate, not knowing that it takes at least a month to get a grip on your server environment. You might be pressed to free up money to cover hidden costs or purchase new equipment -- yes, new servers will likely be needed for what's supposed to be a server consolidation project. Even if you navigate these and other pitfalls, you'll likely be blindsided by virtualisation vendors' over-the-top performance claims.
What's behind the virtualisation buzz
Server virtualisation breaks up the marriage of hardware and software (in this case, between the physical system and operating system software), and thus allows a single physical server to host many virtual servers running different operating systems. The benefits of this basic capability border on computing nirvana, not the least of which is server consolidation. For instance, IBM started moving the workload of its 3900 servers to 30 virtualised System z9 mainframes running Linux. Big Blue expects to cut energy consumption by 80 percent, or more than US$2 million in energy costs. Meanwhile, NetApp consolidated 343 servers to 177 via virtualisation and replaced 50 storage systems with 10 new ones.
Indeed, the front lines are awash with server virtualisation success stories -- and the drumbeat grows louder every day. EMC's virtualisation high-flyer unit, VMware, raised nearly $1 billion in its public offering last summer, based on a highly regarded product (see the Test Center review of VMware Infrastructure 3.0). Citrix Systems, which acquired server virtualisation vendor XenSource in December, took the wrappings off of XenServer 4.1 earlier this month. Last week, market researcher Gartner called virtualisation "the most important trend for servers through 2012."
Now Microsoft plans to shake up the virtual world with its Hyper-V, a virtual machine manager, or "hypervisor," the company is building into Windows Server 2008. Currently in beta and due out this summer, Hyper-V has already stirred debate among Test Center reviewers. Chief Technologist Tom Yager applauded the offering in February, while Paul Venezia panned Hyper-V in a Test Center preview two months earlier, noting, among other things, that his attempts to run the disk manager often resulted in a lockup. "It has a long way to go to be production-ready," Venezia wrote.
Further, Hyper-V will come to market lacking advanced features, such as live VM migration, that have long been present in VMware's enterprise offering. On the other hand, Hyper-V comes "free" as part of the operating system; and Microsoft's integration of virtual machine management into its pantheon of management tools is sure to be a hit with Windows shops.
Marketing buzz aside, the truth is that server virtualisation fundamentally changes the way a datacenter looks and feels -- and no major transformation comes easy.
Gotcha No. 1: You may not get the hardware savings you expect
One of the great ironies of server virtualisation is that many people expect the technology to save them boatloads of money from the outset when, in fact, it often costs them more. That's because server virtualisation demands two things: shared storage and some new servers that are powerful, richly configured, and equipped with hardware virtualisation support built into their AMD or Intel processors.
Even if you already have these souped-up servers, you're still not out of the woods. Server interoperability issues stymie many virtualisation journeys. "You can't mix AMD and Intel platforms together in the same [VMware] ESX cluster," says Chris Wolf, an analyst at the Burton Group. "You cannot move a virtual machine between them without restarting."
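Wolf's cluster constraint amounts to a simple invariant: live migration is only possible between hosts with matching CPU vendors. A minimal sketch of that check, using a hypothetical host inventory (the field names and host names are illustrative, not VMware's API):

```python
# Guard against live-migrating a VM between hosts with different CPU
# vendors, which in an ESX-style cluster forces a restart instead.
# The inventory below is hypothetical example data.

def can_live_migrate(source_host: dict, target_host: dict) -> bool:
    """Allow live migration only between hosts whose CPU vendors match."""
    return source_host["cpu_vendor"] == target_host["cpu_vendor"]

cluster = [
    {"name": "esx01", "cpu_vendor": "Intel"},
    {"name": "esx02", "cpu_vendor": "Intel"},
    {"name": "esx03", "cpu_vendor": "AMD"},  # mixed vendor: moves to/from here need a restart
]

print(can_live_migrate(cluster[0], cluster[1]))  # True: Intel -> Intel
print(can_live_migrate(cluster[0], cluster[2]))  # False: Intel -> AMD
```

In practice this is why shops end up strictly standardising hardware across a virtual farm, as Dineley notes below.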
The same goes for a storage area network, or SAN. Not every SAN supports a virtualised environment. Also, existing network bandwidth may not be sufficient to handle the needs of a growing number of virtual servers. This means you'll likely end up spending money on new servers, switches, and other tech gear. Even worse, upgrade costs can offset nearly all the initial savings from decommissioned servers, says Matt Prigge, a consultant and Test Center contributor.
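Prigge's point about upgrade costs offsetting savings is easy to see with back-of-the-envelope arithmetic. Every figure below is an illustrative assumption, not data from the article:

```python
# Hypothetical first-year ledger: savings from decommissioned servers
# vs. spend on virtualisation-ready gear. All dollar figures assumed.

decommissioned_servers = 40
saving_per_server = 1500       # assumed annual power/maintenance saving per box, USD
new_host_cost = 8000           # assumed price of one richly configured host
new_hosts_needed = 5
san_and_switches = 25000       # assumed shared-storage and network upgrades

annual_savings = decommissioned_servers * saving_per_server
upgrade_costs = new_hosts_needed * new_host_cost + san_and_switches

print(f"savings: ${annual_savings}, upgrades: ${upgrade_costs}")
# With these assumptions, upgrades ($65,000) eat the whole first
# year's savings ($60,000) -- the payback only arrives later.
```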
When the server virtualisation wave began to crest, industry watchers thought that the server market would be in a lot of trouble. After all, virtualisation allows people to consolidate many applications onto fewer servers -- preferably existing ones. And they were partly right: Gartner believes that virtualisation reduced the x86 server market by 4 percent in 2006.
But it soon became apparent, the Test Center's Dineley says, "that you needed to strictly standardise on hardware for your virtual farm." Thus the server market remains strong: Some 8 million servers were shipped worldwide last year, a 6.7 percent increase from the year prior, according to IDC.
Most people tackle hardware standardisation and server virtualisation slowly, usually when servers are due for retirement. They dabble in noncritical areas such as print servers before moving on to e-mail applications and enterprise databases. "It's a rolling-thunder approach," says John Humphreys, an IDC analyst. "We'll start to see the impact on [server] unit growth two, three, or four years down the road, as more people virtualise."
Gotcha No. 2: Getting the right staff experience is a challenge
IDG Research Services, a sister unit of InfoWorld, surveyed 464 participants late last year about their virtualisation experience. The biggest challenge? Forty-four percent of respondents said inadequate skills and training was the most difficult hurdle, followed by software licensing issues, performance and scalability challenges, and complexity.
So don't expect the IT staff to have all the answers to virtualisation from the get-go. It'll take at least a month to gain an accurate understanding of current server workloads, given weekly and monthly spikes, before deciding which servers can be virtualised. In small companies with only a handful of IT folks, you may need to hire -- surprise! -- a pricey consultant to conduct capacity planning.
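The month of measurement matters because spikes, not averages, decide whether a server is a safe consolidation candidate. A minimal capacity-planning sketch along those lines; the servers, samples, and 40 percent threshold are all illustrative assumptions:

```python
# Pick virtualisation candidates by peak CPU utilisation over weeks of
# samples, not the average. Data and thresholds here are hypothetical.

def peak_utilisation(samples: list[float]) -> float:
    """Use a high percentile rather than the max, to ignore one-off blips."""
    ordered = sorted(samples)
    idx = int(0.95 * (len(ordered) - 1))
    return ordered[idx]

def virtualisation_candidates(servers: dict[str, list[float]],
                              threshold: float = 40.0) -> list[str]:
    """Servers whose 95th-percentile CPU stays under the threshold."""
    return [name for name, samples in servers.items()
            if peak_utilisation(samples) < threshold]

# A month of hourly CPU% samples would be ~720 points per server;
# these short lists stand in for real monitoring data.
servers = {
    "print01": [3, 5, 4, 6, 8, 5, 7, 4],          # mostly idle: good candidate
    "mail01":  [20, 35, 30, 35, 38, 33, 25, 30],  # moderate but steady
    "db01":    [60, 75, 80, 90, 85, 70, 65, 88],  # busy: keep on its own box
}

print(virtualisation_candidates(servers))  # ['print01', 'mail01']
```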
A small company also may not have the necessary SAN expertise or, for instance, capability to mesh Cisco switches and VMware's complex virtual networking stack. "Virtualisation draws together so many different aspects of networking, server configuration, and storage configuration that it requires a well-seasoned jack-of-all-trades to implement successfully in a small environment," Prigge wrote in a "virtual" case study for the Test Center that's chock full of insight about challenges ranging from pricing and products to technical and skills requirements.
Larger companies don't have it any easier. Getting a lot of people in disparate teams -- server, storage, business continuity, security -- on the same page is a feat, especially since they traditionally don't talk to each other very much. All of them, though, need to be educated about virtualisation. If there's a problem with an application, for instance, an administrator must know where virtual machines exist throughout the server farm so that he doesn't reboot a physical server and unwittingly take down all the virtual machines on it.
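That pre-reboot check can be reduced to a simple inventory lookup. A sketch, with a hypothetical inventory standing in for whatever management tool the shop actually runs:

```python
# Before rebooting a physical host, check which VMs it carries so a
# routine reboot doesn't silently take them all down. Example data only.

vm_inventory = {
    "host-a": ["mail-vm", "crm-vm", "print-vm"],
    "host-b": [],  # no VMs: safe to reboot
}

def safe_to_reboot(host: str) -> bool:
    """Refuse a reboot while the host still carries virtual machines."""
    vms = vm_inventory.get(host, [])
    if vms:
        print(f"{host} carries {len(vms)} VMs ({', '.join(vms)}) -- migrate them first")
        return False
    return True

print(safe_to_reboot("host-a"))  # False
print(safe_to_reboot("host-b"))  # True
```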
Gotcha No. 3: Performance boosts aren't always what they're cracked up to be
Despite the hard work, virtualisation adopters may feel a sting of disappointment. Many will have embraced server virtualisation with grand expectations, only to see performance fall short. Burton Group's Wolf points the finger at vendors: "For me, the way VMware advertises performance benchmarks is completely inaccurate."
Virtual machine benchmarks in vendor publicity materials involve running a single virtual machine on a single physical host. But a typical production environment conservatively runs eight to 12 virtual machines per physical host. "This paints an overly optimistic picture of performance," Wolf adds. "They also tend to gloss over things like over-allocation of CPU cores" that can tax the hypervisor's CPU scheduler and lower performance.
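The over-allocation Wolf describes is plain arithmetic: a production host running the eight to 12 VMs quoted above can easily promise more virtual CPUs than it has physical cores. The core counts below are hypothetical:

```python
# Illustrative CPU overcommit arithmetic. All numbers are assumptions.

physical_cores = 8   # e.g. a two-socket quad-core host of the era
vms_per_host = 10    # mid-range of the 8-12 figure quoted above
vcpus_per_vm = 2

vcpus_promised = vms_per_host * vcpus_per_vm
overcommit_ratio = vcpus_promised / physical_cores

print(f"{vcpus_promised} vCPUs on {physical_cores} cores "
      f"-> {overcommit_ratio:.1f}x overcommit")
# 20 vCPUs on 8 cores -> 2.5x overcommit
```

A single-VM benchmark, by contrast, runs at a ratio well under 1.0, which is why it says little about how the scheduler behaves under real load.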
Memory is another big performance-buster, Wolf says, especially when virtualising multithreaded applications. When separate threads within an operating system continually update memory, the hypervisor's shadow page tables fall behind. The result: latency. For applications that rely heavily on memory, latency spikes and application responsiveness deteriorates. Users may start seeing connection timeouts.
"Hardware-assisted memory is one solution, but it's also a crap shoot," Wolf says. "Some applications run better with hardware-assisted memory virtualisation while others run better with shadow page tables."
The fallout of lackluster performance can be huge. A company might have to fork out more cash for servers. Business execs may demand that applications be given their own servers again. And trust, once lost, is slow to rebuild: "It may take a couple of years before a company attempts to virtualise again," Wolf says.
And there are even more gotchas to avoid
Poor performance, unprepared staff, and hidden costs are only a sampling of the pitfalls in server-virtualisation adoption. Managing the whereabouts of virtual machines can be a nightmare, given that they can be moved from one physical server to another, or even walk out the door on a portable hard disk. Security risks abound, too. Audit failures due to the lack of full separation of security zones can happen more easily in a virtual environment.
And then there's the threat of virtual server sprawl; new applications are easy to get up and running in a virtual world. "Virtualisation increases your appetite for software," IDC's Humphreys says. "One company went from 1000 applications to almost 1300." Not only are there potential software licensing costs involved, but also the task of tagging and tracking all those applications.