For the Australian market especially, the opening of Microsoft Azure in Melbourne and Sydney triggered a switch in most companies’ minds, according to Purdy.
“They now believe the public cloud is a viable option for enterprise workloads,” he claims.
“That’s not quite the case.
“Some older applications just aren’t suitable for public cloud platforms due to the way public clouds deal with failure of services – i.e. they rely on the application to deal with the failure whereas traditional applications have relied on underlying infrastructure (like VMware/HyperV) to maintain service.”
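The distinction Purdy draws is architectural: traditional applications assume the hypervisor keeps them alive, while cloud-native applications must tolerate transient service failures themselves. A minimal sketch of that application-level resilience pattern, using a hypothetical flaky service and retry-with-backoff logic (all names here are illustrative, not from any specific cloud SDK):

```python
import time

def call_with_retry(operation, max_attempts=3, base_delay=0.01):
    """Retry a transiently failing operation with exponential backoff.

    In a public cloud, the application is expected to absorb
    service failures like this itself, rather than relying on an
    infrastructure layer (e.g. VMware/Hyper-V) to maintain service.
    """
    for attempt in range(max_attempts):
        try:
            return operation()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the failure
            time.sleep(base_delay * 2 ** attempt)  # back off, then retry

# Hypothetical service that fails twice, then succeeds.
calls = {"n": 0}
def flaky_service():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = call_with_retry(flaky_service)
print(result)  # -> ok
```

Older applications written without this kind of logic are the ones Purdy suggests aren't suitable for public cloud platforms as they stand.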
But Purdy acknowledges that public cloud providers are starting to understand this, and are beginning to introduce services that address the challenge.
Amazon Web Services (AWS) recently announced high availability (‘HA’) features, and Google already supports live migration for maintenance, so applications aren’t disrupted while the underlying hardware is under repair.
“Hyperscale providers (AWS, Azure and Google) are only going to step up their services to enable enterprise workload support, which will speed up the move to use hyperscalers for core IT systems,” Purdy adds.
Moving on to the topic of public cloud, Purdy believes that even if cloud spending surpasses spending on traditional data centres more quickly than expected, it doesn’t mean people will move to public cloud services en masse.
“There are customers who will continue to believe that the risk and the cost to move aren’t justified, or that their business model doesn’t fit a cloud/operational-expenditure model,” he adds.
“On-premise just won’t go away – those workloads aren’t moving to public cloud anytime soon.”
So for at least the next five years, Purdy believes, the industry will be living in a hybrid world.
“But clearly the traditional vendors such as VMware, IBM, HP and Microsoft will be touting their public cloud revenue to meet the expectations of Wall Street,” he adds.
By Rob Purdy - Director of Cloud and Enterprise Tools, Datacom Australia