From software-defined datacentres (SDDC) to an ever-increasing focus on sustainability and energy efficiency, the datacentre market in Australia is thriving and undergoing massive change thanks to game-changing technologies including Cloud computing. A select group of attendees gathered at an exclusive roundtable lunch to discuss datacentre predictions for 2014 – with a key focus on the SDDC promise and vision. The roundtable also touched on the angst over a deskilled reseller market, the worry surrounding a skills shortage, and the challenges of preparing customers for the transformative technological journey in the datacentre. JENNIFER O’BRIEN reports.
Jennifer O’Brien (JO): How has the datacentre industry evolved? What are some noticeable developments?
John Donovan (JD), VMware: We’ve been working for a long time on helping customers and the partner community understand what the Cloud is for them and architect what a hybrid Cloud looks like. Also on educating them on how to use their existing resources and then be able to move those workloads in a secure fashion into and out of the datacentres. The evolution of the datacentre, the software defined datacentre and the software defined networking and network virtualisation layers are incredibly important to us. This is what we do – we architect the next layer of what this technology looks like. We’ve all got a responsibility to the industry to help describe what this is, what it does and why it’s meaningful to partners and customers, rather than just being hot technology. We should focus on the agility and the cost controls and how it makes things easier to do.
Damien Spillane (DS), Digital Realty: The evolution that we’ve seen in the past three to four years in terms of the workloads and the type of deployments customers are putting into datacentres has changed phenomenally. The changes over the last five years, from iPhone apps to Big Data and the next horizon, which is the Internet of Everything or M2M, have driven an extraordinary amount of data requirements so, in turn, storage requirements have grown massively. The density of the storage arrays is growing at an astronomical rate. The efficiency of the storage from a gigabyte perspective, and from a cost and space perspective, has grown significantly. And the requirement in terms of kilowatts is still growing. The demand for storage is one of the real trends that we’re seeing in the datacentre space. There’s also a greater awareness of efficiency. The other aspect that we’ve seen evolve is security. As more data has been put into the Cloud – personal, business, legal, and commercial data – organisations are becoming much more aware of security. The requirements around physical and operational certifications, particularly from a security perspective, are an area that we’re seeing growing – and there are other evolutions to come. The final trend is, of course, connectivity. With all of this data and all of the compute now residing in these large datacentres, the connectivity between them and the size of the pipes, the diversity of the pipes and, of course, the cost of the pipes is a major factor and is changing. The requirement for connectivity is growing.
Allan King (AK), Infront Systems: From the evolution side, it’s at the operational level that we’re seeing the major shift. We’ve trended with the marketing rhetoric over the last three to four years around business demanding more from IT. But where the pressures are coming from internally within the business is not necessarily aligned to the marketing. It’s not the business demanding more, it’s IT demanding more from IT – demand that is outstripping our ability to address IT’s requirements and spawning this shadow IT market.
Our observations, particularly over the last six to 12 months, have been that the application development team is demanding more from the operational team in terms of agility. Their ability to develop and deliver to market quickly is what the business expects. And traditionally we’ve just been very slow at an operational level to adjust to that. With every major customer we’re working with at the moment, we’re engaged to meet the application development team’s requirements for agility. Where we are unable to do that in a timely manner is where an organisation spins up a Cloud service. They are literally taking their credit card and spinning up a test development environment to identify and meet business requirements. That’s a challenge for IT and it’s a big evolutionary step from what we’ve always done in decades past. We’ve been very static and very slow.
The other thing to address is the exponential growth in the data we manage. We’ve been pretty lazy in a lot of ways with our governance frameworks when it comes to ensuring that we are right-sizing our environment. We’re consuming a lot of capacity but not necessarily using it well. The biggest trend for us is automation. It’s a top-down vision. But the first thing we articulate to a customer is that the strategy is built bottom up. Automation requires orchestration; it requires tight integration.
Peter Hewett (PH), Westcon: We’re in the middle of an evolution in datacentres thanks to what’s happening with software defined datacentres and networking. It’s a very exciting time and we have a great deal to learn as a distributor; we all have a great deal to learn about what’s going to happen next. It’s going to be a very exciting time over the next two to five years.
JO: What is the definition and power of the SDDC?
Aaron Steppat (AS), VMware: It is a common infrastructure that guarantees the service levels to the business for what makes their business tick. It’s what gives them their competitive edge. It’s what helps them realise time to value and gets their time to market down. And those are the key things that a SDDC should, and will, actually provide.
In the datacentre of old, you had a lot of physical elements and, as part of our strategy, we came along from a historic perspective and said, ‘well, we’re going to abstract first and foremost that thing called the CPU.’ And by moving that into a software construct we realised so many benefits, and many organisations have realised a ton of savings on the power side, and realised efficiencies they never thought were possible. We’ve been doing that now for 15 years, and it’s had a significant impact. To take that approach of what we did to the CPU and extend it to other elements of the datacentre is what SDDC is all about.
Abstraction is that first key element to getting to that point. It’s not just the abstraction of compute, it’s also the network. The next shiny object in that space is network virtualisation – the traditional term of software defined networking where you’re doing exactly the same thing. You’re taking that traditional physical, rigid, inflexible construct and you’re moving it into software. Because as soon as you move it into software you get agility, you get that scale, but you get that control of scale. We don’t just stop at the network; we also move into storage. Having this whole premise of software defined storage and bringing the ability to get high availability and high performance, but from a consolidated and highly virtualised infrastructure, is another key step in that journey to get to the SDDC.
JO: How is the SDDC solving the challenges associated with the traditional datacentre silos?
AS: One of the key problems in datacentres today is siloed IT, where you have a complete silo for one application and a complete silo for another. You run out of capacity in one; you can’t just pick it up from another and give it to that to solve your service level issue. Being able to take those disparate resources across many different towers, put them into a large bucket and re-carve them out based on the service levels the business requires is the next key element to be able to realise the success of the software defined datacentre. Last but not least is to automate it to drive that efficiency. Without automating it, you won’t realise efficiencies around everything from right-sizing to getting visibility into critical applications to being able to consolidate even further and get that flow-on effect around power and real estate savings. Automation is an opex play: it’s how you actually drive efficiency into the way your people manage their datacentres.
JO: Are customers skilled up in the SDDC arena, and are we seeing real-world implementations of it?
AS: We’re on a journey. Our customers have been on this journey for some time and our partners have been joining us. We have customers today who have done the compute side and now they’re really getting involved in the network and the storage piece. From our perspective, a lot of the uptake and success we’re seeing is at the management and automation layer.
The next biggest uptake we see is around visibility. How do I know that this application actually needs the eight CPUs and the 64 gig of RAM that I gave it? Chances are it doesn’t. If I can give that back to my infrastructure and use it for an application that does genuinely require it – this is just one of those capabilities that we see in our management piece that gives you that visibility to be able to do that. So we see a lot of success there.
JO: Are resellers skilling up in the SDDC in this area?
JD: What we’re doing is we’re fundamentally talking about how we rearchitect the entire infrastructure of the datacentre again. Granted, when you start looking at the virtualisation and the network layer, that’s a complex task. Fundamentally, datacentres have been built on a very specific construct. And we’re looking to create a much more agile approach to how that’s put together. And that can be a complex thing and it can be quite a challenging thing as well. So we’re under no misapprehensions about the level of enablement that will need to occur throughout the partner community. Because one thing that we keep very near and dear to our hearts is ensuring that the partner community is front and centre in delivering the benefits of this technology.
In just about every single case the partner community has been responsible for the architecture, development, building, implementation and management of these datacentres with their customers. We need them to be at the forefront of how we start to transform this infrastructure as well. We always call this sort of stuff the journey, and it absolutely is. It extends the journey that we kicked off around infrastructure virtualisation and the creation of private, public and hybrid Clouds into the software defined datacentre, virtualising additional layers. That’s going to require an ongoing effort from VMware and other vendors, ensuring that partners remain front and centre and have the appropriate skills to do it. This will take a while.
JO: Is a new type of partner coming to the table?
JD: Any partner that has a good understanding of the intellectual property of the customers that they’re trying to sell to or manage can call themselves what they want. I just need to ensure that they have the appropriate skill levels. Historically, software companies have tried to engage partnerships with organisations where they become automatic extensions of their own sales force, their own technical force. Meaning whatever we say goes and you have to pony up and do exactly the same thing and here are all the competencies you have to do. We’ve moved beyond that. We understand that it’s a little bit more complex and a little bit more nuanced. Certain partners will have skills in some areas and other partners will have skills in different areas. It becomes a much more community approach to how we architect and solve some of these problems. I don’t think any software or hardware vendor trying to force their own view of the world down the throat of a partner is going to be terribly successful. As vendors, we might compete on certain technologies but we’ve all got a responsibility to present a common view about how this is going to work. It becomes more and more important.
Wayne Neich (WN), Nutanix: The important thing to focus on is, what is it that businesses are trying to achieve? Clearly, there’s a total disillusionment at board level about what IT is doing and has done for years. It is a very expensive monster; it’s like owning your own dinosaur. They cost a lot of money to feed. They cost a lot of money to maintain. The people that are at that board level are looking for how they can commoditise the delivery of IT as a service. To be able to commoditise requires the simplicity, the visibility. It is about being able to see all these core resource infrastructure elements in a single pane of glass, including storage, virtualisation, networking, and compute. If you’re going to commoditise it, then you have to commoditise the skills around it. You have to take the complexity out. You’re ultimately trying to reduce the cost of it and be able to scale it.
PH: The partner’s value is that you are a CIO for hire. You’ve got all that experience and you can provide that advice with good solid grounding. And in the two years I’ve been at Westcon I’ve seen a number of our partners evolve to that, and it’s great to see. We’ve still got partners out there that are shifting iron. And they need to evolve. They need to do the thinking and do the math and work out how to help the customer to see that Cloud can be a good thing, but you need to really understand what it costs to make the transition and what it is going to cost you and your business.
JO: Does datacentre efficiency remain a key market driver for Australian businesses?
Jason Rylands (JR), DPSA: One of the big things to come along recently is legislation called NABERS (the National Australian Built Environment Rating System), which is essentially a star rating of buildings, an energy efficiency rating, which helps deal with the intense power that server rooms take up and addresses the inefficiency of datacentres. In general, it measures the energy efficiency, water usage, waste management and indoor environment quality of a building or tenancy and its impact on the environment. The legislation is causing people to look at their servers and datacentres and question whether they need particular equipment in the commercial office space, or whether there is a better place for it to live. As a result, we see a lot of the ‘build or buy’ argument going on.
DS: NABERS will be a big disruptor in the next two to three years.
JO: Are there teeth in the legislation?
DS: Not currently. At the moment it’s voluntary. But look at how it evolved in the commercial office space: it started out there being voluntary and within two to three years it became mandatory. Currently, around 80 per cent of the Australian office stock is NABERS rated. And government is just a starting point. Enterprise organisations, along with anybody who’s got any aspect of sustainability, are prescribing a certain NABERS rating. And that will happen initially with datacentres. I don’t see any reason why it won’t grow to the Cloud. When you’re looking at deploying a Cloud application, you will also need to understand how efficient the energy use producing the work in that Cloud is. And that is a real factor. There is a perception of the Cloud being up in the air, that it doesn’t consume energy, that it is floating around. And even one of the advantages of the SDDC is driving that efficiency. But again, by abstracting it you take it away from the reality that it’s powered, it’s plugged in, and it’s using power. As the price of energy and the impact of carbon become more prevalent, along with water and all the other sustainability aspects, the holistic energy efficiency and sustainability of IT in general will become a major factor and will evolve.
JO: How aware, and receptive, are customers about the SDDC?
Matt Zwolenski (MZ), EMC: Two things are driving our strategy: the software defined datacentre and Big Data. Customers want to know what it means. How do we get there? Are there products and solutions available today? What will things look like in the next three to five years? As I talk to CIOs and CFOs I ask them, ‘Why haven’t you adopted the software defined datacentre yet? What’s the biggest impediment to why you’re not there today?’ The answer is never the technology. It’s never the products or solutions. The answers are usually: ‘My organisational structure doesn’t support it; my culture in my organisation doesn’t support it or something in my process is holding me back from delivering it.’ We’ve done a lot of work in consulting in terms of IT transformation for organisations to help set them up so they can provide internal consumption models back to the business to charge the business as Opex.
JO: What is the view of the software defined datacentre from the distribution layer?
Ean Mackney (EM), Westcon Group: I’m watching the vendors have a much larger conversation around getting back in touch with their alliance partners, and it’s around the applications and what you can provide. And this is probably going to be the stepping stone and the building blocks to what will become the software defined network. Because I know we’re talking about it as if it’s the here and now, and it is, but I’d be interested to hear everybody’s thoughts on how far away we think this is from going live. Because I don’t think this is something that’s going to happen today. This could be five, could be 10 years, and we’re all going to end up there. I have every confidence in that. And obviously the way that we consume it and the way that we offer that to the partners is going to be the really interesting piece.
MZ: In terms of going live though, I bet if you go into major Cloud providers you’ll see that they don’t have separate pools of compute and storage.
EM: Absolutely not. What I see more often than not at the moment is more like Platform-as-a-Service (PaaS), and a lot of the smaller, more niche guys have been positioning themselves as a Platform-as-a-Service. They are actually really great at SAP. They have Oracle skills. These are guys that are suddenly finding themselves moving up into a Software-as-a-Service conversation because they didn’t think that was where they were meant to sit, but they realise that unless they have the skills in house and the ability to offer that, then they are going to be dead in the water. So the next question there is: what are the offerings that the vendors have? Obviously the initial server consolidation came along a long time ago, and we were all there for it. That was realistically the first software defined step for us. The only difference now is we’re including the Internet of Everything in between where we’re trying to go and charge that out. We are figuring out how to do it, right now.
The partner challenge
Infront Systems’ managing director, Allan King, said there are a number of challenges facing partners as customers transform to a world with Cloud computing and the journey towards converged infrastructure. But none so dire, he warned, as the monster issue lurking in the closet: the general deskilling of the partner.
“We have had a general deskilling of our industry over the last ten years, which is now making it incredibly difficult, from a partner community, to be able to spin up or acquire those skills needed to provide those complex orchestrated services.”
King said the general deskilling is wreaking havoc across the partner community. “You try and hire someone today who hasn’t been in the industry longer than 15 years and say, ‘I need you to write a script.’ They say, ‘what do you mean?’ Microsoft has done a wonderful job in providing commodity-based services. What that’s done to the broader community is create a generalist in IT.
“But now you’re asking us to be able to have that single click model over complex services through an automated fashion that requires deep understanding of all the elements to get somebody to understand how it all comes together, and the guy can’t write a script? The guy doesn’t know a programming language? He knows how to hit a ‘next, next, next’ button.”
WhiteGold Solutions commercial director, Leigh Howard, said complexity brings opportunity for resellers, which is particularly crucial now that Generation Ys are demanding simple, manageable IT solutions and implementations, even in the datacentre.
“I don’t know why we have a skill shortage in the IT industry because we absolutely need those skills. Because isn’t it the role of the reseller or the integrator to translate complexity into simplicity for the end user?
“The end user, unfortunately, like it or not, is becoming the Gen Y of today. They’re the influencers, they’re the decision-makers. They’ve grown up in a world where they can go ‘next, next, next’ and it just works for them. They’ve got a smart phone that they’ve grown up with. They were on an iPad at the age of three and four, playing games and swiping and tapping. They don’t want complexity,” he said.
“Their expectation of the datacentre environment, whether it’s running an enterprise software portfolio or anything else, is that it should be simple, and it probably should be, and we are probably moving towards that. It is the channel’s role to translate that complexity. The vendors are bringing out some brilliant software to do that, but you need the skills to translate that into something that is not complex, because the customer has a right to demand that they can just swipe, tap, splat, whatever they do.”