Google takes on real-time big data analysis with new cloud services

Google has unveiled a real-time data processing service and updated its BigQuery analysis tool

Google is betting that real-time processing is the future of big data analysis, and has updated two of its cloud services to help enterprises understand what is happening in the moment with their customers and operations.

"We really believe that streaming is the way the world is going. Instead looking at data from two months or two years ago, the data you really care about is happening right now," said Tom Kershaw, director of product management for the Google Cloud Platform.

Think of the mobile gaming company that wants to know which of its products has gone viral, or the security-sensitive enterprise combing its vast server logs for evidence of the latest attacks.

To this end, Google has launched a real-time data processing engine called Google Cloud Dataflow, first announced a year ago. It has also added new features to its BigQuery analysis tool, introduced in 2010. The two cloud services can be used together to facilitate the real-time processing of large amounts of data, Kershaw said.

Now available as a beta, Google Cloud Dataflow can analyze data as it arrives from a live stream of updates. Google takes care of all the hardware provisioning and software configuration, allowing users to ramp up the service without worrying about the underlying infrastructure. The service can also analyze data already stored on disk, in batch mode, allowing an organization to mix historical and current analysis in the same workflow.

The service provides a way "for any Java or Python programmer to write applications using big data," Kershaw said. "It makes it easy to run end-to-end jobs across very complex data sets."
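To make that concrete, here is a minimal sketch of such a job, written against the Apache Beam Python SDK (the open-source descendant of the original Dataflow SDK). The project ID, bucket paths and log format are all hypothetical.

```python
# Minimal batch pipeline sketch: count occurrences of each event across a
# set of log files. Project ID and Cloud Storage paths are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner='DataflowRunner',             # hand execution to the managed service
    project='my-project',                # hypothetical project ID
    temp_location='gs://my-bucket/tmp',  # scratch space for the service
)

with beam.Pipeline(options=options) as p:
    (p
     | 'ReadLogs' >> beam.io.ReadFromText('gs://my-bucket/logs/*.log')
     | 'ExtractEvents' >> beam.FlatMap(lambda line: line.split())
     | 'CountPerEvent' >> beam.combiners.Count.PerElement()
     | 'Format' >> beam.Map(lambda kv: '%s,%d' % kv)
     | 'WriteCounts' >> beam.io.WriteToText('gs://my-bucket/output/counts'))
```

Swapping the file-based source for a streaming one such as Pub/Sub is what turns the same workflow from batch analysis into the real-time processing Kershaw describes.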

In addition to moving Cloud Dataflow into an open beta program, Google also updated its BigQuery service.

BigQuery provides a SQL (Structured Query Language) interface for large unstructured datasets. SQL is commonly used for traditional relational databases, so it is almost universally understood by database administrators. With this update, Google has improved the service so it can now ingest up to 100,000 rows per second per table.
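For illustration, a streaming insert of that kind looks roughly like the following with the google-cloud-bigquery Python client; the table name and row contents are hypothetical.

```python
# Hedged sketch of streaming rows into a BigQuery table as events arrive.
# Table name and row contents are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
table_id = 'my-project.game_events.scores'  # hypothetical table

rows = [
    {'player': 'alice', 'score': 1200, 'ts': '2015-04-16T12:00:00Z'},
    {'player': 'bob', 'score': 900, 'ts': '2015-04-16T12:00:01Z'},
]

# insert_rows_json performs a streaming insert: the rows become queryable
# within seconds, without waiting for a batch load job.
errors = client.insert_rows_json(table_id, rows)
if errors:
    print('Streaming insert reported errors:', errors)
```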

The company has expanded the footprint of BigQuery so that European customers can now use the service. BigQuery data can be stored in Google's European data centers, which will help organizations that need to meet the European Union's data sovereignty regulations.

The company has also added row-level permissions to BigQuery, which limit the accessibility of information based on the user's credentials. This allows an organization to protect sensitive portions of its data, such as names and addresses, while granting wider access to other portions, such as anonymized purchase histories, for research or other purposes.
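In today's BigQuery, that kind of restriction is expressed as a row access policy defined in SQL DDL. The sketch below, run through the same Python client, is purely illustrative; the project, dataset, table, column and group names are all hypothetical.

```python
# Hedged sketch: restrict a group of analysts to rows without personal data.
# All names (project, dataset, table, column, group) are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
ddl = """
CREATE ROW ACCESS POLICY analysts_no_pii
ON `my-project.sales.orders`
GRANT TO ('group:analysts@example.com')
FILTER USING (contains_pii = FALSE)
"""
client.query(ddl).result()  # runs the DDL and waits for it to complete
```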

BigQuery and Dataflow can be used in conjunction with each other, Kershaw said. "The two are very much aligned. You can use Cloud Dataflow for processing and BigQuery to analyze," he said.
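A sketch of that division of labor, again using the Beam Python SDK: Dataflow parses a live Pub/Sub stream and lands the results in a BigQuery table, where they can be queried with ordinary SQL. The topic, table and schema are hypothetical.

```python
# Hedged sketch: Cloud Dataflow for processing, BigQuery for analysis.
# Pub/Sub topic, table name and schema are hypothetical.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # a Pub/Sub source requires streaming mode

with beam.Pipeline(options=options) as p:
    (p
     | 'ReadStream' >> beam.io.ReadFromPubSub(
           topic='projects/my-project/topics/game-events')
     | 'Parse' >> beam.Map(lambda msg: json.loads(msg.decode('utf-8')))
     | 'ToBigQuery' >> beam.io.WriteToBigQuery(
           'my-project:analytics.game_events',
           schema='player:STRING,score:INTEGER'))
```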

Joab Jackson covers enterprise software and general technology breaking news for The IDG News Service. Follow Joab on Twitter at @Joab_Jackson. Joab's e-mail address is Joab_Jackson@idg.com


Tags: Google, Managed Services, cloud computing, internet
