The adage “everything old is new again” is worth keeping firmly in mind when considering virtualisation technologies. Despite recent media interest, the concept goes back to the early days of mainframe computing, when time-sharing environments provided “virtual machine” terminal sessions for all users.
Today, many large organisations view the diaspora of computing away from the central IT environment with something like regret. Despite the constraints of the bad old days, centralisation has a lot of virtues – particularly flexibility in deployment, management and security. As computing moved out into the organisation, it fragmented into numerous increasingly powerful workstations and numerous dedicated processors – servers – linked together on a network. Some of these servers handled storage.
Centralising the storage components by putting them on a high-speed, fibre-optic network – a SAN – was the first act of virtualisation, permitting disks to be virtualised and assigned according to need. But if storage, why not the servers themselves? So servers were virtualised, providing flexibility and central management – and requiring fewer hardware units. Finally, if servers, why not workstations? And so we arrive back at the beginning, only with a considerably more complex environment, a lot more processing, and much stiffer requirements. Virtualisation is now being driven by increased security and compliance requirements, as well as by a continued movement of the technology down the scale from enterprise to small business. For resellers, it offers the chance to take a look at clients’ infrastructure, to see where efficiencies might be added through consolidation.

Virtualisation in hardware

“In the last three to four years we have seen a fundamental shift in how small to medium businesses need to use IT resources,” says Adrian Deluca, solutions engineer, Hitachi Data Systems. “Organisations can’t live without email systems, outages can’t go above a few hours, and there is a need to ensure infrastructure is well aligned to applications, providing ongoing protection and performance. Organisations are finding that their ability to adapt to changing business requirements in short time frames is not only difficult, but risky and costly as well. Many need to address scenarios such as rapid growth in customer base or an increase in the number of transactions, but this is only one part of it. Most have trouble keeping up with day-to-day issues like providing adequate data protection for growing applications, performing routine data migrations or meeting new compliance requirements or guidelines handed down in their industry. What may seem to be small changes to management may translate into huge changes in the IT environment.
“Unfortunately, many organisations face this challenge by tactically buying more equipment, throwing more hardware and software at the problem and hoping it will go away. Although it may provide a short-term reprieve, chances are in the long term you are worse off. Virtualisation tries to address this by making the IT infrastructure more fluid, ensuring IT resources are more flexible and adaptive from the outset,” Deluca says.

Virtualisation can provide a number of advantages for an organisation, including:

• Management simplification, through a consistent interface and process for provisioning and managing all storage resources via common storage-management tools.
• Cost reduction, by eliminating the stranded capacity problem associated with non-virtualised arrays.
• Performance improvement, by striping and caching across more storage resources.
• Risk reduction, by simplifying the steps in redeploying capacity when the needs of an application change.

“I think most organisations today understand virtualisation and the high-level business benefits it can bring,” says Deluca. “The confusion is more around how they can make elements of it work for their particular business requirements and environment. Many fail to do the basics, like assessing the Recovery Time Objectives and Recovery Point Objectives expected of their applications, uncovering repetitive tasks that can be simplified, or identifying inefficient use of storage resources in their environment.”
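Deluca's point about RPO basics can be made concrete. A minimal sketch – with entirely hypothetical application names and figures – of the kind of check an administrator might run to flag applications whose backup schedule cannot meet the stated Recovery Point Objective:

```python
# Sketch only: all application names, RPOs and backup intervals are
# illustrative assumptions, not drawn from any real environment.
# Worst-case data loss for an interval-based backup is one full interval,
# so the backup interval must not exceed the RPO.

# (application name, RPO in hours, backup interval in hours)
apps = [
    ("email", 1, 4),      # backups every 4h cannot satisfy a 1h RPO
    ("payroll", 24, 12),  # daily RPO, twice-daily backups: fine
    ("crm", 4, 4),        # worst-case loss equals the RPO exactly: fine
]

def rpo_gaps(apps):
    """Return the applications whose worst-case data loss
    (one backup interval) exceeds their stated RPO."""
    return [name for name, rpo, interval in apps if interval > rpo]

print(rpo_gaps(apps))  # only the email system fails its RPO here
```

The same shape of audit applies to Recovery Time Objectives, substituting measured restore times for backup intervals.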
Critical scalability

In a recent survey, IDC asked IT users which features they most expected from storage virtualisation solutions. Not surprisingly, scalability was the number one feature – many of the organisations evaluating virtualisation had experienced a dramatic increase in demand for storage resources and knew the challenges of scaling their IT environment horizontally. Being relatively savvy storage users, they were also likely to have an investment in multiple storage resources already, so the ability to provide heterogeneous volume management and data migration means they can re-purpose what they already have.

“Although elements of virtualisation have been around for some time, the fact is, all this technology is difficult to put together in the context of servicing the needs of an application,” says Hitachi’s Deluca. “The IT resources used to deliver the service typically have no idea about the needs of the application, and vice versa. Therefore we rely on administrators to weave it all together. But most organisations today simply can’t afford to bring applications down, not even for an hour. Furthermore, they just don’t have the people and resources to deal with the day-to-day running of their IT environment. Business applications are 24x7 today; their infrastructure needs to be the same.”

Hitachi Data Systems saw the need for virtualisation technology in storage early on, and began research and development in the late 1990s. The technology allowed multiple, heterogeneous operating systems to be connected in a SAN and share a single fibre channel port controller without disruption to the other hosts sharing it. This saved customers the cost of buying storage arrays with many ports to support each operating system they had. As opposed to the two types of network-based virtualisation (in-band and out-of-band), Hitachi Data Systems has chosen to implement virtualisation inside the array.
“Taking an early position in virtualisation has really paid dividends for Hitachi, capturing technology leadership and significant market share,” says Deluca. “IT outsourcing organisations are particularly interested in virtualisation to help them consolidate their storage pools, manage them with ease and react quickly to customers’ needs. Resellers should really start by understanding their customers’ major business challenges and what impact those are having on their IT resources and ability to service the business effectively.”

Virtualisation in software

“Simply put, server virtualisation enables organisations to maximise server assets while delivering application services to the business,” says Symantec group manager Sean Derrington. “Symantec enables IT organisations to effectively and efficiently manage both the physical and virtual environments. IT organisations are taking an infrastructure approach to server and storage management and are looking for a consistent, standard way to manage the entire server/application portfolio. The use of virtualisation technology will also offer a major step forward in enterprise security. With the security threat landscape in a typical enterprise changing daily, security vendors must develop more innovative ways to protect desktop endpoints. Evolutionary security enhancements have just managed to keep pace with threats, but it is clear that more revolutionary security models, including those that use virtualisation, will be needed to secure the desktop in the future.”
Symantec offers an industry-leading application clustering solution. At the recent VMworld trade show, the company announced the extension of this capability to VMware ESX 3.0 implementations. Veritas Cluster Server by Symantec will give IT organisations the ability to maintain application availability across both physical and virtual environments.
“There is some confusion between server virtualisation and storage virtualisation,” says Derrington. “Moreover, there are unique implementations of server virtualisation, and consequently, server vendors’ virtualisation solutions will vary. Symantec provides enterprise-class storage and application management for both physical (single operating system instance per server) and virtual (many operating system instances per server) environments on Unix, Linux and Windows platforms.”
Symantec is now working with Intel to build security solutions for Intel’s new vPro virtualisation technology that will allow IT managers to manage security threats outside the main PC operating system. In this isolated, virtual environment, embedded within the vPro technology, security solutions will be more tamper-resistant and “always on”, monitoring and protecting the desktop.
“There is significant uptake within New Zealand as organisations want to maximise their server infrastructure,” says Derrington. “Additionally, virtualisation is being used more in production environments, thus increasing the availability requirements for applications.”

Virtualisation as a service

“At Unisys, we have been providing services in our data centres rather than running customer equipment,” says managed services manager Shane Gaskin. “We take ownership of the infrastructure and leverage that across multiple clients. So some of the services that we offer, such as storage-on-demand, take advantage of our capability to procure equipment and to provide disk to multiple clients on an as-required basis. For us, virtualisation provides advantages in cost and efficiency through better utilisation of resources.”

Through server virtualisation technologies, Unisys is able to provide greater flexibility and lower cost to its clients. “The key advantage lies in ensuring that we make better use of our IT resources,” says Gaskin. “We are also hoping to move to the next level, which is to carve up processing capacity and let customers use more or less of that as needed.

“Ultimately, this is moving toward a utility model of IT provisioning. We also want to reduce deployment time. These technologies provide the capability to deploy resources to the customer on an as-required basis – all in a matter of minutes, rather than going through a lengthy acquisition and implementation process.”

There has been considerable marketplace confusion over virtualisation. One key issue is that virtualisation is not in itself a service but rather an enabler that allows services to be performed far more effectively. Key components are network, storage and compute resources, and their management. “Virtualisation is moving organisations toward a more centralised and more easily managed approach to computing,” says Gaskin.
“This is a basic approach that harks back to the time-sharing mainframe, but is better supported by new technologies. One development is the increased usage of virtual desktop environments using thin clients.”
Virtualisation on the desktop
“Server consolidation was the start of the current interest in virtualisation, reducing the number of servers in the data centre,” says Wyse Technology regional sales manager Ward Nash. “Server consolidation has morphed into virtualisation, which is a software way of reducing servers by providing virtual sessions. One server can provide different operating systems and different sessions.
"Virtualisation is about reducing hardware and administration costs in managing hardware and applications. If you can centralise everything in a data centre, though, why not reduce desktop support costs as well. This is where Wyse comes in. Desktop support costs are even higher than server support costs. We're trying to reduce the complexity of the desktop, and we can do this through shared services [thin clients], streaming, and virtualisation.”
Wyse sees virtualisation as removing the software from the hardware. Users do not need to be concerned with hardware specifics – they just want to get to their desktop environment. Wyse is indifferent to the underlying VM solution, providing thin client devices that can work with Citrix, VMware or Sun.