Stories by Brian Dooley
The datacentre is in the midst of an evolution toward consolidation, virtualisation and the cloud that will have a widespread impact on the way IT services are consumed. What is up for grabs is how and where hardware is located and supported, to say nothing of how systems and services are sold. The good news for resellers is that while old markets are shutting down, new opportunities continue to emerge. It is more important than ever to keep up with the changes going on in this critical area.

IBM provides services from its own datacentres, as well as building them for clients. “We have built a new datacentre at Highbrook [in Auckland], which serves as the foundation for provision of services to customers, and provides high availability, high reliability, green features, and scalability,” says integrated technology services manager, Paul Douglas. “We also help clients build their own datacentres, using our intellectual property and project management. The datacentre is the foundation, the integral cornerstone, for provision of cloud services.”

“Cloud and virtualisation initiatives are dominating the datacentre space,” says Douglas. “As we move our own infrastructure from older installations to the Auckland datacentre, we ask what we can consolidate, and what we can virtualise. Virtualisation is a more cost-efficient way of managing infrastructure and applications. However, it leads to greater density of computing resource, which in turn leads to greater power and cooling requirements. More complexity in the datacentre architecture also affects networking requirements. New technologies from Juniper and Cisco, such as QFabric and Nexus, enable high-speed connectivity between devices and servers. There has been significant change in datacentre architecture.”

There are some interesting possibilities for local datacentres emerging from the ongoing evolution of cloud.
Data sovereignty can be critical for customers, who may be asked to guarantee that data will be held onshore. It is also useful to note that New Zealand is recognised as one of the most trusted countries in the world, so it can serve as a data hub. IBM has had discussions about the possibility of storing data in New Zealand for access from Australia and other locations.

For resellers, opportunities are emerging in cloud and cloud-based applications. “We have a fairly significant roadmap of managed services rolling out through 2012,” says Douglas. “Resellers can take those services and integrate them into a client’s environment. One example is secure mobility: having workplace applications securely provided to bring your own device (BYOD) smartphones and tablets, to avoid putting the organisation at risk from insecure devices.”

HP is another key player in the datacentre space. “After some years of minimal change, datacentres are experiencing a surge of investment and replacement,” says chief technology officer David Eaton. “Most of this investment has focussed on upgrading the facilities of existing datacentres, or replacing existing datacentres with new facilities that offer higher degrees of resiliency and redundancy. The extent of investment required for this has also had an effect on the services that are offered. It is now rare to find simple hosting services offered in a Tier 3 datacentre. This is being replaced either by managed services on vendor- or client-owned infrastructure, or by fully managed and shared infrastructure using cloud computing.”

The focus within datacentres has been on increasing technology to manage and monitor the infrastructure, so the number of people required to run datacentres will continue to decline. “As organisations adapt their IT technologies to use cloud computing, IT technology will increasingly adopt Internet-style technologies,” says Eaton.
“These technologies are resilient by design, and free IT from relying on hardware-based technologies that provide redundancy. Datacentres supporting resilient cloud computing applications can be separated by hundreds if not thousands of kilometres. Legacy application stacks tied to SAN-based redundancy will continue to require alternate datacentres within 20 to 50 kilometres of the primary centre.”

Datacentre location, however, is not as flexible as the distance that can be physically spanned. “As cloud computing involves the sharing of services between clients, the location of the datacentre becomes more important to the vendor and the client,” says Eaton. “Client and vendor legal and regulatory responsibilities are important. As it is difficult or impossible for a New Zealand-based organisation to enforce New Zealand contract law, or to protect the rights over data enshrined in the Privacy Act, in another legal jurisdiction, the focus for New Zealand-based organisations will continue to be on using local datacentres and services.”

Maxnet is a New Zealand-owned ISP and datacentre services provider offering connectivity, hosting, online back-up and end-to-end virtualisation services to wholesale ICT providers and direct to businesses. “We operate a 200-rack high availability datacentre in our Albany, Auckland headquarters and a second 40-rack facility in Christchurch,” says enterprise architect, Jeremy Nees. “Our Albany datacentre is the highest-density commercially available datacentre in New Zealand. We have a strong focus on innovating in the datacentre space.”

For Maxnet, the key trends for datacentres are the continual increase in power density and efficiency, as well as the virtualisation of systems. “We are now seeing a move away from hosting physical systems, and a focus on providing virtual storage and processing resources,” says Nees.
“The other prominent trend emerging in this space is around smarter management of datacentres.”

Cloud services are also advancing. “An increasing number of our customers are looking to use cloud services as ‘total solutions’, as they see cloud computing as a viable alternative to owning equipment,” says Nees. “To get the most out of public cloud IaaS offerings, however, your systems really need to be designed to take advantage of the way those services operate. It is much easier to simply move private cloud services across to a cloud provider. Private cloud services also tend to provide a greater comfort level, with the recent Megaupload shutdown highlighting the risks of using a consumer-grade file upload service.”

Maxnet sees the chief opportunity in this area for resellers in going where datacentre operators can’t go, and using datacentre services to provide total solutions to their customers. Resellers can customise offerings and provide something more unique or tailored to a customer’s requirements. “Assisting businesses with using local cloud services presents a fantastic opportunity,” says Nees. “There are a lot of people very interested in cloud services; they understand why they should be using them, but are struggling with the ‘how’ and ‘where’ when looking to progress further. Having a trusted partner guide them through this process, and then continuing to support them into the future, is definitely in demand. Resellers are crucial to ensuring the end customer is realising the best value from the service being delivered. This is all about spending the time to understand their true business requirements and adding value to various services to really make them a perfect fit.” All Maxnet services can be resold through approved channel partners.

Oracle provides an integrated portfolio of servers, storage, software, and networking products.
“These products have been engineered to work together to create what we believe is the next inflection point in datacentre economics,” says senior sales manager, Mark Raos. “Customers typically spend 70 percent of their IT budgets integrating and running disparate pools of technology designed in isolation from each other. Oracle takes a much wider view of the datacentre, integrating all the way from the application to the disk.”

Today’s datacentre is migrating from a physical, static, and heterogeneous set-up to a grid-based virtualised infrastructure that enables self-service, policy-based resource management, and capacity planning. “With the continued evolution of the cloud computing model, Oracle’s view is that datacentres will move to a POD (integrated racks of server, storage, network and software) design optimised for service delivery rather than technology function,” says Raos. “With these designed on open-standards interfaces, customers can then choose how they leverage datacentres across public and private infrastructure.”

Organisations are continuing to consolidate and virtualise their datacentres. “Big data is another major technology trend affecting IT infrastructure,” says Raos. “There is a potential treasure trove of non-traditional, less-structured data: weblogs, social media, email, sensors, and photographs that can be mined for useful information. Decreases in the cost of both storage and compute power have made it feasible to collect this data. Many businesses have been caught off guard by the boom in ‘big data’ and are reacting to the situation with a short-term increase in outsourced datacentre and cloud service use, while planning longer term to build their own in-house datacentre facilities.”

In Australia and New Zealand, possibly the single biggest driver of datacentre consolidation for large organisations is cost containment, which is even more prevalent in Australia with the upcoming introduction of the carbon tax.
Growth in the scope and power of the datacentre has significantly increased its energy consumption. IT teams are monitoring server and storage utilisation and must consider ways to raise utilisation levels so that IT hardware does not sit idle while consuming power and driving up ventilation and cooling costs.

“Organisations have realised that their datacentres still have dedicated silos, where each application runs on its own middleware, database, servers and storage,” says Raos. “Each silo is sized for peak load, and therefore there is a significant amount of excess capacity built in. Each silo is also different, leading to complexity and high management costs. Organisations are today moving from these silos to a grid or virtual environment with shared services, dynamic provisioning and standardised configurations or appliances. The majority of the organisations I talk to are engaged in some form of consolidation, though they may be doing this in only a portion of their datacentre.”

Many organisations will further evolve to a self-service private cloud offering the same flexibility and incremental cost advantages to end users as public clouds, but with less perceived risk and greater assurances of security and accountability. “In the meantime, we believe public clouds will continue to mature and eventually create a potent mix of private and public clouds, a hybrid cloud, which will run a single application, managed in a federated manner, through a single pane of glass.

“Oracle President Mark Hurd is moving to simultaneously expand both Oracle’s channel sales and direct sales coverage by 25 percent,” says Raos. “He says the focus for partners will be on the hard-to-reach midmarket and SMB organisations. The new pay-as-you-grow software licensing flexibility for the Oracle Database Appliance is one of the many examples of how Oracle is adapting its channel programs.”
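The capacity argument behind consolidation comes down to simple arithmetic: silos each sized for their own peak hold far more idle hardware than a shared pool sized for the combined peak, because the silos rarely all peak at once. A minimal sketch, using entirely hypothetical workload figures (the silo names and core counts are illustrative, not from any vendor):

```python
# Hypothetical hourly load (CPU cores) for three application silos.
# Siloed model: each silo is sized for its own peak.
# Shared-grid model: one pool sized for the peak of the combined load.
silo_loads = {
    "web":   [40, 90, 60, 30],
    "batch": [10, 20, 30, 95],
    "erp":   [70, 30, 40, 20],
}

# Sum of per-silo peaks: capacity needed when each silo stands alone.
siloed_capacity = sum(max(load) for load in silo_loads.values())

# Peak of the summed load: capacity needed when resources are pooled.
combined = [sum(hour) for hour in zip(*silo_loads.values())]
shared_capacity = max(combined)

print(siloed_capacity)  # 90 + 95 + 70 = 255 cores
print(shared_capacity)  # 145 cores: silos peak at different hours
```

With these made-up figures the shared pool needs 145 cores against 255 for separate silos, roughly 43 percent less hardware drawing power and needing cooling, which is the excess capacity Raos describes.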
Over the past several years, backup strategies have matured, converged, and centralised. Companies are aware of the need, and willing to implement protection. However, implementation needs to be efficient; the backup strategy needs to be targeted to the company, its data, and its line of business; and the media needs to be secure. Strategies such as storage networks and hierarchical data storage have migrated down from the enterprise, and the myriad disk-mirroring and virtual server solutions have created almost limitless possibilities for moulding a backup solution to ensure business continuity at an acceptable price.

For the reseller, understanding this market and customers’ requirements is crucial. Customers are seeking guidance, and need to find the best possible solution for their business. The key to this market is keeping abreast of the growing range of technology aimed at the business continuity sector. For every customer, there is an optimal solution.

“Symantec offers data protection to the SME market through to large enterprise customers,” says systems engineering manager, Paul Lancaster. “We have proven data protection technologies, and have been a leader in protecting heterogeneous environments for more than 20 years.” Backup Exec is Symantec’s flagship product for small and mid-sized businesses. Veritas Netbackup Puredisk Remote Office Edition offers storage- and bandwidth-optimised data protection for remote offices. For enterprise-level users, Symantec has introduced key advances in Netbackup 6.0 and has a comprehensive upgrade process, aligned with tools to ensure customers plan properly. Netbackup is designed to aid in compliance, performance and data centralisation. “Crucial to Symantec’s backup solutions is the ability to deploy, manage, secure and protect our customers’ critical data,” says Lancaster.
“The centralised, managed protection of data is essential, combined with reducing storage requirements for disk backups using low-bandwidth data synchronisation over a WAN [wide area network]. The range of available options for users extends from the ability to restore their own data via our Puredisk product right through to being able to rebuild their environment through our Backup Exec system recovery product. These solutions are growing as businesses move away from tape to disk-based backups.”

As data or remote offices grow, customers can dynamically add storage to their environment, and Symantec’s Puredisk product can be used to automatically redistribute content to improve load balancing and performance. Reducing tape media, storage, and network resources makes it possible to enforce enterprise data management and compliance policies. “Tape is entrenched with most enterprise customers, but adoption of disk-based solutions is strong,” says Lancaster. “Enterprises will shift more of their backups to disk, but tape will remain part of the majority of enterprises’ data protection strategies for the future.”

Robust email essential for data protection

Symantec has just completed an SME roadshow on email management, protection and security. The presentation focused on avoiding risk, strategy and leveraging the benefits of a healthy, robust email system. “Resellers have the ability to highlight the customers’ requirements and need for effective data protection technology in the remote office,” says Lancaster. “Another key area is data de-duplication technology, which reduces the disk capacity required for backup and uses bandwidth more efficiently for remote sites. There is also an ever-increasing opportunity to simplify disaster recovery and the long-term management of backup data.”

Quantum offers a number of backup and recovery solutions that range from $400 tape drives to floor-standing enterprise tape libraries, along with a broad range of backup-to-disk solutions.
“In the New Zealand market, our key volume sales come from autoloaders and rack-mounted tape libraries, along with one- to 10-terabyte virtual tape libraries,” says country manager, Craig Tamlin. “Backup is being driven by massively expanding data volumes (in excess of 50 percent per annum for typical clients); mass adoption of SANs (the more disk you have, the more you fill it – then you have to back it up); and, to a point, all the attention given to regulatory compliance. People are more aware now than before of data retention and recovery requirements,” Tamlin says.

Quantum’s most notable recent advance is the release of de-duplication technology. This offers the potential for sites to retain weeks of backup copies on disk, with many recovery points. On a terabyte disk array, clients can store 10 to 20 – sometimes even 50 – terabytes of backup, depending on how it is used. “Tape’s low cost of storage will ensure it stays forever,” says Tamlin, “but its role is slowly changing as new, better and more cost-effective backup-to-disk solutions appear – such as de-duplication. Tape will play more of an archival role in the future.”

Ask customers the backup question

For the reseller, Tamlin suggests a targeted approach. “Ask the backup question. When the client says ‘Oh, I’ve got an old tape backup unit,’ respond ‘OK, but will it cope with your new backup workload? Let me help you to size the right solution.’ Then call us.”

“Backup solutions are becoming more cost-effective and quicker,” says Adaptec country manager Demetri Christodoulou. “Disk-to-disk-to-tape (D2D2T) provides near-line storage and offline backup as well. Backup for workstations and laptop systems is becoming more important and critical, because they contain important business data but are often neglected in the backup plan.
Easy-to-use software like Snap Storeassure offers automatic and continuous data backup for desktops and laptops back to the main data storage system online, without affecting the workstation’s performance.”

On the enterprise side, Christodoulou has seen an increase in enterprises integrating NAS into their existing networks, because it offers easier implementation and management at a lower price point compared to SAN products. “Tape is still strong in archiving, but we are seeing a lot more pick-up on disk,” he says. “More are using D2D2T solutions that provide shorter backup times and use tape for archiving purposes. This is also driven by the declining cost per gigabyte of SATA disk.” Resellers should sell more backup solutions and educate clients on backup solutions suitable for the clients’ business. “More evangelism on the importance of backup is also needed,” says Christodoulou.

More users now sit on terabytes of data

“Backup solutions are great, but when the crunch comes it’s recovery that is important to customers,” says CA senior consultant, Cory Grant. CA provides recovery management solutions for small to medium and enterprise businesses. “We cover from traditional tape-based recovery to newer technology around D2D2T with our CA Arcserve products, and now with CA Xosoft true continuous data protection (CDP),” says Grant. “We are and have been seeing a definite shift in the marketplace from managing backup to recovery management, as the data protection needs of business today far exceed what traditional backup and restore systems can provide.
In particular, these traditional systems can’t fulfil escalating requirements for recovery and continuity – requirements that are being driven by many factors, including the increasing cost of downtime, disaster response failures, the growing importance of information to the business, and the increasingly distributed nature of IT operations.”

New Zealand’s storage market is maturing, and CA is seeing more and more clients with terabytes of disk sitting in the datacentre. Enterprises want to make efficient use of the disk they have and make sure it is easy to back up and recover the vast amounts of storage that exist. Customers are still buying more disks than ever, and they are increasingly moving to 24-hour-a-day operations. This makes it important to provide a recovery management solution that does not impact day-to-day operations but still protects these large amounts of data.

“Tape is still our best and most efficient method of long-term archive,” says Grant. “However, restorations on a regular basis should be from high-speed, low-cost disk arrays, or even better from the CA Xosoft GUI, which has more roll-back points. If you know companies that are relying on tape-based recovery only, you should be talking to them now about moving up to a more mature offering and using tape as the last resort.” Resellers should also be asking their customers if they know where their data is, says Grant. “It is probably on a laptop or on a server; but more and more data can now be found on USB keys, iPods, phones and so forth. Now ask the second question: ‘Is it protected?’”

One alternative to standard backup is to outsource, if infrastructure requirements get too complex. Revera (formerly HdS) provides just such a service. “The case for outsourcing backup is irresistible,” says marketing manager Roger Cockayne. “IP bandwidth is sufficiently cheap, and backup and restore applications have improved, particularly in compressing data and putting data owners in control of their data.
Practical and economic barriers to accessing tier-one backup infrastructure have all but disappeared. There’s simply no good reason for companies to invest in their own backup infrastructure – working it is painful and expensive. Just because backup is business-critical doesn’t mean you can’t let it go. Most businesses cannot achieve the robustness or cost efficiency of an independent computing infrastructure provider,” Cockayne says.

Revera has just been appointed as New Zealand distributor for data backup software provider Attix5. Cockayne claims the combination of Attix5 backup software and Revera’s Virtual Data Centre computing infrastructure platform will provide an “industrial-strength” service at an SME price.
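The de-duplication technology that Lancaster and Tamlin describe works by storing each unique data block only once and keeping later backups as lists of references, so nightly copies of largely unchanged data consume little extra disk. A minimal content-addressed sketch (the DedupStore class and block size are hypothetical illustrations, not any vendor's implementation):

```python
import hashlib


class DedupStore:
    """Toy block-level de-duplication: each unique block is stored once,
    keyed by its SHA-256 digest; a backup is a list of digest references."""

    def __init__(self, block_size=4096):
        self.block_size = block_size
        self.blocks = {}   # digest -> block bytes (stored exactly once)
        self.backups = {}  # backup name -> ordered list of digests

    def backup(self, name, data):
        """Split data into blocks, storing only blocks not seen before."""
        refs = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            digest = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(digest, block)  # skip if already stored
            refs.append(digest)
        self.backups[name] = refs

    def restore(self, name):
        """Reassemble a backup from its block references."""
        return b"".join(self.blocks[d] for d in self.backups[name])

    def stored_bytes(self):
        """Actual disk consumed: unique blocks only."""
        return sum(len(b) for b in self.blocks.values())


store = DedupStore(block_size=4)
store.backup("mon", b"aaaabbbbcccc")
store.backup("tue", b"aaaabbbbdddd")  # differs in only one block
print(store.stored_bytes())  # 16: four unique blocks, not six
```

Two backups differing in one block cost barely more than one, which is how weeks of recovery points fit on a single disk array; the same digest comparison lets only new blocks cross a WAN link, the bandwidth saving Lancaster mentions for remote offices.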