Ask any five IT specialists what cloud computing is, and you're likely to get five different answers. That's partly because cloud computing is merely the latest, broadest development in a trend that's been growing for years.
Cloud computing is the most recent successor to grid computing, utility computing, virtualisation and clustering. Cloud computing overlaps those concepts but has its own meaning: the ability to connect to software and data on the Internet (the cloud) instead of on your hard drive or local network.
To do anything with a PC 10 years ago, you needed to buy and install software. Now, cloud computing allows users to access programs and resources across the Internet as if they were on their own machines.
In the Beginning
First, there were mainframe computers, then minicomputers, PCs and servers. As computers became physically smaller and resources more distributed, problems sometimes arose when users needed more computing power.
IT pros tried clustering computers, allowing them to talk with one another and balance computing loads. Users didn't care which CPU ran their program, and cluster software managed everything. But clustering proved to be difficult and expensive.
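The basic idea behind cluster scheduling can be pictured with a toy round-robin dispatcher. This is only an illustrative sketch with invented node names, not how any particular cluster product worked; real cluster software balanced on actual load, not just rotation order:

```python
from itertools import cycle

# Hypothetical cluster nodes; real schedulers tracked CPU load, not just order.
nodes = ["node-a", "node-b", "node-c"]
next_node = cycle(nodes)

def dispatch(job):
    """Assign a job to the next node in round-robin rotation.

    The user never sees which CPU runs the job; the scheduler decides.
    """
    node = next(next_node)
    return node, job

# Six jobs land two apiece on the three nodes, in rotation.
assignments = [dispatch(f"job-{i}") for i in range(6)]
```

The appeal, and the difficulty, is that all of this bookkeeping had to happen invisibly and reliably across machines, which is where clustering became hard and expensive.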
In the early 1990s, the grid concept emerged: Users could connect to a network, much as they plugged into the electrical power grid, and use services on a metered, utility-style basis. Thus, people began speaking of utility computing.
One problem was where data was stored. Grid nodes could be located anywhere in the world, but there could be significant processing delays while data stored at other locations was transmitted.
Also, grid or cloud computing means users and businesses must migrate their applications and data to a third party or different platform. For enterprises with huge investments in existing software and operational procedures, this has been a real barrier to adoption of these shared technologies. Other significant concerns include data security and confidentiality.
Why It Works
Critical to the success of cloud computing has been the growth of virtualisation, allowing one computer to act as if it were another -- or many others. Server virtualisation lets clouds support more applications than traditional computing grids, hosting various kinds of middleware on virtual machines throughout the cloud.
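One way to picture the benefit of server virtualisation is consolidation: many small virtual machines packed onto fewer physical hosts. A minimal first-fit placement sketch, with invented capacities and VM sizes purely for illustration:

```python
def place_vms(vm_sizes, host_capacity):
    """First-fit placement: put each VM on the first host with room,
    opening a new physical host only when none fits.

    Returns the used capacity per host and each VM's host index.
    """
    hosts = []       # used capacity on each physical host
    placements = []  # host index chosen for each VM
    for size in vm_sizes:
        for i, used in enumerate(hosts):
            if used + size <= host_capacity:
                hosts[i] += size
                placements.append(i)
                break
        else:
            hosts.append(size)  # no host had room; bring one online
            placements.append(len(hosts) - 1)
    return hosts, placements

# Ten small workloads (sizes in arbitrary CPU units) that would otherwise
# each occupy a dedicated server fit on three virtualised hosts.
loads, where = place_vms([2, 3, 5, 1, 4, 2, 3, 2, 1, 2], host_capacity=10)
```

Production hypervisor schedulers are far more sophisticated, but the consolidation arithmetic is what lets a cloud host many kinds of middleware on shared hardware.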
Where It's Going
If cloud computing succeeds on a wide scale, it may well be because of recent initiatives from Amazon, IBM and Google.
In 2007, IBM and Google teamed up to provide the hardware, software and services needed to teach computer science students large-scale distributed computing. Their Academic Cluster Computing Initiative began when a Google software engineer, Christophe Bisciglia, wanted to improve computer science curricula by teaching college students how to solve problems involving massive computer clusters and terabytes of data.
Google's CEO recruited his counterpart at IBM to join the initiative. The two companies say they will dedicate hundreds of computers to it. Located in data centres at Google, IBM's Almaden Research Centre and the University of Washington, these resources are expected to eventually include more than 1,600 processors.
Initially, six universities -- the University of Washington, Stanford University, Carnegie Mellon University, MIT, the University of Maryland and the University of California, Berkeley -- are participating in the Google-IBM program.
Meanwhile, Amazon.com offers a couple of cloud services. Web service developers can use its Simple Storage Service (S3) to store any amount of data, and they can use Amazon's Elastic Compute Cloud (EC2) to set up a virtual server in minutes, without buying, installing or maintaining server hardware and software. Both services are offered on a pay-per-use basis.
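Pay-per-use billing is simple metering arithmetic. With purely illustrative rates (not Amazon's actual prices), a month's bill for one virtual server plus some stored data could be estimated like this:

```python
# Illustrative rates only -- not actual Amazon pricing.
RATE_PER_INSTANCE_HOUR = 0.10  # dollars per virtual-server hour (EC2-style)
RATE_PER_GB_MONTH = 0.15       # dollars per gigabyte stored (S3-style)

def monthly_cost(instance_hours, storage_gb):
    """Metered bill: pay only for hours of compute used and gigabytes stored."""
    return round(instance_hours * RATE_PER_INSTANCE_HOUR
                 + storage_gb * RATE_PER_GB_MONTH, 2)

# One server running around the clock for a 30-day month, plus 50 GB of data:
bill = monthly_cost(instance_hours=24 * 30, storage_gb=50)  # 72.00 + 7.50
```

Shut the server down and delete the data, and the meter stops, which is the utility-computing idea the grid pioneers had in mind.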