Like so many other IT functions, data analytics is moving to the cloud. And as with other cloud-based endeavours, this presents both opportunities and challenges.
One of the top 10 data and analytics technology trends for 2021 cited by Gartner is the use of open, containerised analytics architectures that make analytics capabilities more composable. This enables enterprises to quickly create flexible, intelligent applications that help data analysts connect insights to actions, the research firm says.
“With the centre of data gravity moving to the cloud, composable data and analytics will become a more agile way to build analytics applications enabled by cloud marketplaces and low-code and no-code solutions,” Gartner notes.
The cloud can take data analytics to a new level for companies.
“Cloud enables the scalability we need for high-compute workloads,” says Aidan Taub, systems and technology director at creative services agency Loveurope and Partners (LEAP).
“As the world continues to digitise everything, organisations need to be able to build with file data at exponential scale,” Taub says. “When you have a massive amount of heavy unstructured data, like the videos, images, and audio we handle at LEAP, you never know how big the next job might be. Traditional analytics just doesn’t scale the way cloud does.”
Analytics in the cloud requires different approaches, skills, architectures, and economics compared with performing batch analysis in-house the traditional way, however. And with all this change, there are bound to be hurdles to overcome.
Here are some of the challenges organisations might face, and ways they can address them as they shift to performing data analytics in the cloud.
Fear of losing control — and the unknown
Data analytics is highly strategic for enterprises, and the idea of moving the analytics process to the cloud can be daunting for technology leaders accustomed to having complete control over such resources.
“One of the key challenges we see clients face is organisational inertia and fear of losing control,” says Anthony Abbattista, principal, Advanced Analytics Enablement Leader, at Deloitte Consulting, who has worked with numerous senior IT executives on shifting to cloud-based analytics.
“The traditional role of IT and the CIO has been to protect and be a guardian of data assets,” Abbattista says. To some, the cloud challenges the status quo because it can be quicker to market: product selection and assessment are more limited, provisioning is point-and-click, there’s no need for large incremental capital expenditures, and so on, he says.
“Chief data officers and CIOs need to work together to vet and get comfortable with cloud platforms, so they can help derive business value and competitive advantage at least as quickly as their competitors,” Abbattista says. “This might require adoption of acceptable, proven, and emerging models in the market, rather than designing/architecting the analytics environment from the ground up.”
Many organisations are slow to explore new analytics capabilities, due to the inflexibility of their existing analytics processes, says Brandon Jones, CIO at insurance provider Worldwide Assurance for Employees of Public Agencies (WAEPA). “This results in fewer incentives and initiatives to try new capabilities and drive innovation,” he says.
To overcome this, the IT department at WAEPA used a cloud-enabled sandbox environment to establish a trial-and-error ideation process, using key performance indicators from key stakeholders and creating a prototype-first analytics environment.
Making the shift
Aside from overcoming the perceived loss of control, IT leaders need to deal with the actual move to the cloud and ensure no interruption of services.
“It’s intimidating. For many IT leaders, the hardest thing they’ll ever do is navigate the path to cloud,” Taub says. “But it doesn’t have to be that way if they choose the right solutions.”
When migrating data analytics to the cloud, IT leaders in many cases start with the “lift and shift” approach, by porting existing operations over to the cloud, Taub says. “Often this means re-tooling applications and systems to re-architect them for the cloud,” he adds.
As part of an overhaul of its legacy data infrastructure in 2019, LEAP migrated a massive amount of unstructured file data to the cloud using an analytics platform from Qumulo. LEAP’s file data was previously spread across a range of disparate legacy storage systems, and it was extremely labour-intensive for data administrators to manage and locate data at different points in the workflow.
“Fortunately, Qumulo helped us shift all of our data without the need to refactor applications for the cloud,” Taub says. “I would recommend finding a [tool] that makes it simple to replicate and extract data across multiple environments.”
The shift enabled the company to optimise its data analytics and accelerate performance up to 240 times. Analytics enables the company to see how many clients are connected, who’s using the most bandwidth, and where the system is growing quickly.
“The success of our creative workflow heavily relies on our ability to access data analytics in the cloud,” Taub says. “We have a global network of hundreds of artists, designers, and motion graphics editors working remotely, so we need to leverage the cloud in order to efficiently and securely collaborate on creative projects. Without cloud-based data analytics, our production process would come to a halt.”
Acquiring the right skills
Successful IT endeavours always seem to come down to having the necessary skills in place. Moving analytics to the cloud is no exception.
Deloitte Consulting is seeing demand for skills begin to shift. “Rather than specialists to support each part of the technology stack in traditional analytics/BI [business intelligence], the cloud analytics environment requires more ‘full stack’ thinking,” Abbattista says.
“To address this challenge, the technology team supporting these new-age environments needs to understand the offerings on a cloud platform, adopt standard patterns, and then evolve [as] new techniques, tools, and offerings become available,” he adds.
Companies that choose to build their own analytics platform in a cloud environment or rely on vendor systems will need to have specific in-house technical expertise, says Josh Jewett, who recently left his position as CIO of retailer Dollar Tree.
These include the skills to create and maintain a data lake and derive analytics from it, as well as knowing how best to employ cloud-native or third-party artificial intelligence and machine learning capabilities to draw additional insights from the environment, Jewett says.
“These drawbacks can be overcome through experienced partners and consultants,” Jewett says. The ideal arrangement is to gain experience from these outside experts so that once their contract is up the company possesses the knowledge and expertise to continue to evolve cloud-based analytics as needed, he says.
While at Dollar Tree, Jewett was instrumental in helping the company modernise many of its systems, including data analytics. “Like so many other retailers, my company adopted a hybrid strategy,” he says. It deployed software-as-a-service platforms to provide specific analytics capabilities for critical business functions.
“Examples of this included tools for inventory productivity, pricing optimisation, loss prevention mitigation, and talent acquisition and performance management,” Jewett says. The company also developed some of its own analytics applications in cloud environments to leverage the flexibility, scalability, and speed-to-market advantage of the cloud.
Securing the data
No matter how much cloud service providers emphasise the security of their infrastructures, many clients will always be concerned about how safe their data actually is in the cloud.
This is especially true with analytics, because the insights gained from analysing data can be a competitive differentiator. There is also worry about exposing highly sensitive data such as customer information.
Security is “top-of-mind any time you’re shifting your company’s valuable data out of a private data centre,” Taub says. “LEAP leverages a global network of workers that includes freelance talent, which means we had to make sure our data would be protected in a cloud accessed by internal and external users.”
One of the biggest security concerns is controlling access to cloud applications and data.
“The ease with which someone can use cloud applications opens up challenges, many of which are rooted in the fact that people can inadvertently create security, privacy, and economic concerns,” says Amy O’Connor, chief data and information officer at software provider Precisely.
“Hopping between cloud accounts and securely storing and exchanging keys becomes a security issue,” O’Connor says. “There needs to be strong governance around appropriate use of data. This is more urgent in the cloud than on-premises, because it’s so easy for people to copy data and use it in ways that are not authorised.”
Precisely has a hybrid, multi-cloud model, leveraging multiple cloud vendors for its computing and storage needs.
“Our cloud-based data lake is where we store the bulk of our data,” O’Connor says. “If data originates in the cloud, we start processing there. If we have analytical needs that are bursty in nature, we utilise the cloud. If we need to start analytical processing quickly, we start with the cloud. When we need to process unstructured data and also when we use advanced analytical processing including machine learning, we leverage our cloud-based data lake.”
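O’Connor’s rules of thumb for when a workload belongs in the cloud data lake can be sketched as a simple routing function. The `Workload` fields and names below are illustrative only, not Precisely’s actual model:

```python
from dataclasses import dataclass

@dataclass
class Workload:
    data_origin: str        # "cloud" or "on_prem" -- where the data is born
    bursty: bool            # spiky, unpredictable compute demand
    needs_fast_start: bool  # analytics must begin with little lead time
    unstructured_data: bool # video, audio, free text, etc.
    uses_ml: bool           # advanced analytical processing / machine learning

def placement(w: Workload) -> str:
    """Route a workload per the heuristics quoted above: cloud-born data,
    bursty jobs, quick starts, unstructured data, and ML all go to the
    cloud-based data lake; everything else can stay on-premises."""
    if (w.data_origin == "cloud" or w.bursty or w.needs_fast_start
            or w.unstructured_data or w.uses_ml):
        return "cloud"
    return "on_prem"
```

A steady, predictable batch job over structured on-premises data is the only case that falls through to `"on_prem"` here, which matches the hybrid model O’Connor describes.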
Avoiding a cloud money pit
Although using cloud services can help organisations avoid costs such as those of on-premises storage systems, expenses can quickly get out of control or come in higher than expected.
“Budget is always a concern,” Taub says. “One-size-fits-all data architecture can be an IT spending trap. When making the decision to move analytics to the cloud, enterprises often feel pressured to pay a high upfront cost and get locked into a long contract that doesn’t fit their current needs.”
The key is to find a provider that doesn’t force cloud lock-in. “When evaluating cloud platforms, don’t be afraid to shop around for the right solution that can address your current analytics needs, with the flexibility to scale up as needed for your future needs,” Taub says.
And while it’s easy to get started in the cloud, it’s also easy to move the wrong type of job to the cloud and leave cloud applications and resources running after they are no longer needed, O’Connor says.
Two of the most effective ways to control cloud costs are to take control of the way cloud accounts are created and to be completely transparent about who is consuming cloud resources, O’Connor says.
“To address the first point, we migrated all our cloud accounts under each provider into a single ‘master’ account,” O’Connor says. “We centralised who can create new cloud accounts. Individuals and groups needing new cloud resources go through a formal request process. The request must include business justification, department budget information, and business owner.”
As for transparency, whenever a request is approved, any new cloud account is created by the central team under the master account. “That governance policy allows us to provide transparency into the costs for which we get invoiced by our cloud providers,” O’Connor says. “Each account is created with the information provided in the request, and we can then use the cloud providers’ portals or consoles to monitor the spend that matches each initial request.”
Precisely leverages this spending information for an internal chargeback model, applying the cloud costs accrued to the requestor’s budget. IT uses these approaches “to drive accountability for cloud costs and to ensure we are appropriately spending for the right business reasons,” O’Connor says.
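The chargeback step described above can be sketched in a few lines: spend invoiced per cloud account is applied back to the budget of the department that requested the account. The function and field names here are hypothetical; real spend figures would come from the providers’ billing portals or consoles:

```python
def apply_chargebacks(budgets, account_owners, monthly_spend):
    """Return department budgets after deducting each account's spend.

    budgets        -- dept name -> remaining budget
    account_owners -- account id -> dept name (from the request records)
    monthly_spend  -- account id -> invoiced cost for the month
    """
    remaining = dict(budgets)  # leave the input mapping untouched
    for account, cost in monthly_spend.items():
        dept = account_owners[account]  # owner recorded at request time
        remaining[dept] -= cost
    return remaining
```

Because every account is created through the formal request process, the `account_owners` mapping always resolves, which is what makes the invoice-to-budget attribution straightforward.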