Google takes on real-time big data analysis with new cloud services

Google has unveiled a real-time data processing service and updated its BigQuery analysis tool

Google is betting that real-time processing is the future of big data analysis, and has updated two of its cloud services to help enterprises understand what is happening in the moment with their customers and operations.

"We really believe that streaming is the way the world is going. Instead looking at data from two months or two years ago, the data you really care about is happening right now," said Tom Kershaw, director of product management for the Google Cloud Platform.

Think of the mobile gaming company that wants to know which of its products has gone viral, or the security-sensitive enterprise combing through its vast server logs for evidence of the latest attacks.

To this end, Google has launched a real-time data processing engine called Google Cloud Dataflow, first announced a year ago. It has also added new features to its BigQuery analysis tool, introduced in 2010. The two cloud services can be used together to facilitate the real-time processing of large amounts of data, Kershaw said.

Now available as a beta, Google Cloud Dataflow provides the ability to analyze data as it comes from a live stream of updates. Google takes care of all the hardware provisioning and software configuration, allowing users to ramp up the service without worrying about the underlying infrastructure. The service can also analyze data already stored on disk, in batch mode, allowing an organization to mix historical and current analysis in the same workflow.

The service provides a way "for any Java or Python programmer to write applications using big data," Kershaw said. "It makes it easy to run end-to-end jobs across very complex data sets."
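
As an illustration of that programming model, here is a minimal pipeline sketch in Python written against Apache Beam, the open-source successor to the original Dataflow SDK. The bucket paths are hypothetical placeholders, and this is a sketch of the style rather than Google's own sample code.

```python
# Minimal Dataflow-style pipeline using Apache Beam (the open-source
# successor to the Dataflow SDK). Bucket paths are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()  # pass streaming=True for an unbounded source

with beam.Pipeline(options=options) as p:
    (p
     | "Read"   >> beam.io.ReadFromText("gs://example-bucket/server.log")
     | "Words"  >> beam.FlatMap(lambda line: line.split())
     | "Count"  >> beam.combiners.Count.PerElement()
     | "Format" >> beam.Map(lambda kv: "%s: %d" % kv)
     | "Write"  >> beam.io.WriteToText("gs://example-bucket/word_counts"))
```

The same transform graph runs in batch over files on disk or, with an unbounded source such as Cloud Pub/Sub, over a live stream, which is how historical and real-time analysis can share one workflow.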

In addition to moving Cloud Dataflow into an open beta program, Google also updated its BigQuery service.

BigQuery provides a SQL (Structured Query Language) interface for querying very large datasets. SQL is the standard language of traditional relational databases, so it is almost universally understood by database administrators. With this update, Google has improved the service so it can now ingest up to 100,000 rows per second per table.
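
For a sense of what that ingestion path looks like in code, streaming rows into a BigQuery table can be done with today's google-cloud-bigquery Python client, roughly as below; the project, dataset, and table names are made up for illustration.

```python
# Hedged sketch: streaming inserts with the google-cloud-bigquery
# client. The table ID below is a hypothetical placeholder.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "example-project.game_analytics.events"

rows = [
    {"player": "p1", "event": "level_up", "ts": "2015-04-16T12:00:00Z"},
    {"player": "p2", "event": "purchase", "ts": "2015-04-16T12:00:01Z"},
]

# insert_rows_json performs a streaming insert and returns any
# per-row errors; an empty list means every row was accepted.
errors = client.insert_rows_json(table_id, rows)
if errors:
    print("Insert errors:", errors)
```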

The company has expanded the footprint of BigQuery so European customers can now use the service. BigQuery data can be stored in Google's European data centers, which will help organizations that need to meet the European Union's data sovereignty regulations.

The company has also added row-level permissions to BigQuery, which can limit the accessibility of information based on the user's credentials. This allows organizations to protect portions of the data, such as names and addresses, while allowing wider access to other portions, such as anonymized purchase histories, for research or other purposes.
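
The article does not spell out the syntax of the mechanism, but in today's BigQuery the same idea is expressed as a row access policy created through DDL. A hedged sketch follows, with hypothetical project, table, column, and group names; the 2015-era feature may have differed in detail.

```python
# Hedged sketch: modern BigQuery row-level security via a row access
# policy. All identifiers are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()
ddl = """
CREATE ROW ACCESS POLICY research_rows
ON `example-project.sales.purchases`
GRANT TO ('group:researchers@example.com')
FILTER USING (contains_pii = FALSE)
"""
client.query(ddl).result()  # DDL statements run as ordinary query jobs
```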

BigQuery and Dataflow can be used in conjunction with each other, Kershaw said. "The two are very much aligned. You can use Cloud Dataflow for processing and BigQuery to analyze," he said.
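
A hedged sketch of that division of labor, again in Beam-style Python with made-up names: the pipeline does the processing, then lands its output in a BigQuery table where analysts can query it with SQL.

```python
# Sketch: a pipeline that processes raw events and writes them to
# BigQuery for SQL analysis. Paths, table spec, and schema are
# hypothetical placeholders.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

with beam.Pipeline(options=PipelineOptions()) as p:
    (p
     | "Read"  >> beam.io.ReadFromText("gs://example-bucket/events.jsonl")
     | "Parse" >> beam.Map(json.loads)
     | "Store" >> beam.io.WriteToBigQuery(
           "example-project:analytics.events",
           schema="player:STRING,event:STRING,ts:TIMESTAMP",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
```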

Joab Jackson covers enterprise software and general technology breaking news for The IDG News Service. Follow Joab on Twitter at @Joab_Jackson. Joab's e-mail address is Joab_Jackson@idg.com
