INSIGHT: Top 3 ways to keep big data from becoming big downtime

The rise of the Internet of Things (IoT), and the data it generates, is giving businesses unprecedented opportunities to gain insights.

Organisations must make sure their technology infrastructure is able to handle the load that big data puts on existing systems.

By 2020, Gartner expects that there will be 25 billion connected ‘things’ in use.

The data produced by this rapidly growing collection of connected everyday devices can deliver significant value both to businesses and in everyday life.

Businesses increasingly understand the analytics technology required to make sense of big data.

However, the network and infrastructure requirements of big data are often overlooked, along with the additional security, scalability, and visibility it demands.

“There’s no point having the latest data analytics platform if the data can’t reach its destination securely,” says Stephen Urquhart, general manager A/NZ, Ixia.

“Investment in big data analysis is at risk of becoming a big loss for businesses if they don’t make sure the rest of their infrastructure is designed to handle the volume of data involved in big data analysis.

“Network infrastructure, security, and visibility must be top priorities for businesses that want to get the most out of the large data sets provided by connected devices.

“One of the best ways to make sure that all the systems within an organisation’s technology footprint can handle big data is to test it to breaking point before going live with a big data analytics program.”

Urquhart says there are three major considerations to prevent big data from overloading the organisation’s technology footprint:

1. Infrastructure

A company’s network infrastructure must be able to handle what big data can throw at it. Because deficiencies in any area of the infrastructure put time-to-market and return on investment at risk, infrastructure architects should use proactive network testing to evaluate every design decision.

2. Security

It is no longer sufficient to just choose and deploy products designed to address security needs. Companies need to prioritise deeper insight into their overall security resilience, which can be achieved through testing.

3. Visibility

It is only through successful end-to-end visibility that companies can reap the rewards that big data has to offer. IT teams need to get the most out of their monitoring tools by taking full advantage of their core capabilities.
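Urquhart's "test it to breaking point" advice amounts to a ramp-up load test: keep increasing the offered load until the observed error rate crosses an acceptable threshold, and record where the system broke. A minimal sketch, using a hypothetical simulated backend in place of a real service or commercial test platform:

```python
def simulated_backend(capacity):
    """Stand-in for a real system under test (hypothetical model):
    the error rate climbs once concurrent load exceeds a fixed capacity."""
    def error_rate(load):
        if load <= capacity:
            return 0.0
        return (load - capacity) / load
    return error_rate

def find_breaking_point(error_rate_at, start=10, factor=2,
                        threshold=0.05, ceiling=1_000_000):
    """Ramp the offered load geometrically until the observed error rate
    crosses the acceptable threshold; return the first failing load level."""
    load = start
    while load <= ceiling:
        if error_rate_at(load) > threshold:
            return load
        load *= factor
    return None  # system never broke within the tested range

# Example: a backend that saturates at ~500 concurrent requests
backend = simulated_backend(capacity=500)
print(find_breaking_point(backend))  # -> 640, the first load level that fails
```

In a real pre-production exercise, the simulated backend would be replaced by measurements taken against the actual network path (for example, from a traffic generator), and the breaking point compared against projected peak big data volumes before go-live.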
