Google's latest big-data tool, Mesa, aims for speed

Mesa can hold petabytes of data across multiple servers while fielding millions of updates and queries per day

Google has found a way to stretch a data warehouse across multiple data centers, using an architecture its engineers developed that could pave the way for much larger, more reliable and more responsive cloud-based analysis systems.

Google researchers will discuss the new technology, called Mesa, at the Conference on Very Large Data Bases, happening next month in Hangzhou, China.

A Mesa implementation can hold petabytes of data, update millions of rows per second and field trillions of queries per day, Google says. Extending Mesa across multiple data centers allows the data warehouse to keep working even if one of the data centers fails.

Google built Mesa to store and analyze critical measurement data for its Internet advertising business, but the technology could be used for other, similar data warehouse jobs, the researchers said.

"Mesa ingests data generated by upstream services, aggregates and persists the data internally, and serves the data via user queries," the researchers wrote in a paper describing Mesa.

For Google, Mesa solved a number of operational issues that traditional enterprise data warehouses and other data analysis systems could not.

For one, most commercial data warehouses do not continuously update the data sets, but more typically update them once a day or once a week. Google needed its streams of new data to be analyzed as soon as they were created.

Google also needed strong consistency for its queries, meaning a query should produce the same result from the same source each time, no matter which data center fields it.

Consistency is typically considered a strength of relational database systems, though relational databases can have a hard time ingesting petabytes of data. It's especially hard if the database is replicated across multiple servers in a cluster, which enterprises do to boost responsiveness and uptime. NoSQL databases, such as Cassandra, can easily ingest that much data, but Google needed a greater level of consistency than these technologies can typically offer.
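According to the Mesa paper, updates are applied atomically in versioned batches, and queries run against a committed version, which is what makes answers repeatable across replicas. Here is a toy Python sketch of that idea (the class and method names are invented for illustration and are not Mesa's actual API):

```python
class VersionedTable:
    """Toy model of versioned, atomic batch updates (illustrative only)."""

    def __init__(self):
        self.batches = []    # committed update batches, in version order
        self.committed = 0   # highest committed version number

    def apply_batch(self, rows):
        """Atomically commit a batch of (key, delta) updates as one new version."""
        self.batches.append(rows)
        self.committed += 1

    def query(self, key, version=None):
        """Aggregate a key's deltas up to a committed version.

        Any replica holding the same committed batches returns the same
        answer for the same (key, version) pair -- the repeatability
        property the article describes."""
        version = self.committed if version is None else version
        return sum(delta
                   for batch in self.batches[:version]
                   for k, delta in batch
                   if k == key)
```

Because every query is pinned to a committed version, two data centers holding the same batches cannot disagree on a result, no matter which one fields the query.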

The Google researchers said no commercial or existing open-source software could meet all of these requirements, so they created Mesa.

Mesa relies on a number of other technologies developed by the company, including the Colossus distributed file system, the BigTable distributed data storage system and the MapReduce data analysis framework. To keep replicas consistent, Google engineers used their own implementation of Paxos, a widely studied distributed consensus protocol.
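Paxos, originally described by Leslie Lamport, lets a majority of replicas agree on a value even when some nodes fail. Below is a toy single-decree sketch in Python (function and class names are invented for illustration; Google's production implementation is far more elaborate):

```python
class Acceptor:
    """One replica's vote-keeping state in single-decree Paxos."""

    def __init__(self):
        self.promised = 0      # highest proposal number promised
        self.accepted_n = 0    # proposal number of last accepted value
        self.accepted_v = None # last accepted value, if any

    def prepare(self, n):
        """Phase 1: promise to ignore proposals numbered below n."""
        if n > self.promised:
            self.promised = n
            return True, self.accepted_n, self.accepted_v
        return False, None, None

    def accept(self, n, v):
        """Phase 2: accept the value unless a higher promise was made."""
        if n >= self.promised:
            self.promised = n
            self.accepted_n = n
            self.accepted_v = v
            return True
        return False


def propose(acceptors, n, value):
    """Try to get a majority of acceptors to choose a value."""
    # Phase 1: gather promises from a majority.
    replies = [a.prepare(n) for a in acceptors]
    granted = [(an, av) for ok, an, av in replies if ok]
    if len(granted) <= len(acceptors) // 2:
        return None  # no majority of promises
    # Safety rule: adopt the highest-numbered previously accepted value.
    prior = [(an, av) for an, av in granted if av is not None]
    if prior:
        value = max(prior)[1]
    # Phase 2: ask acceptors to accept the (possibly adopted) value.
    accepts = sum(a.accept(n, value) for a in acceptors)
    return value if accepts > len(acceptors) // 2 else None
```

The key safety property: once a majority has accepted a value, any later proposal discovers and re-proposes that same value rather than overwriting it, which is why replicated systems like Mesa can stay consistent across data centers.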

In addition to scalability and consistency, Mesa offers another advantage: it can run on generic servers, eliminating the need for specialized, expensive hardware. As a result, Mesa can be run as a cloud service and easily scaled up or down to meet job requirements.

Mesa is the latest in a series of novel data-processing applications and architectures that Google has developed to serve its business.

Some Google innovations have gone on to provide the foundations for widely used applications. For example, Google's papers on MapReduce and the Google File System provided the blueprint for Apache Hadoop, and BigTable inspired Apache HBase.

Other Google technologies developed for internal use have subsequently been offered as cloud services from the company itself. Google's Dremel ad-hoc query system for read-only data went on to become a foundation of the company's BigQuery service.

Future commercial prospects for Mesa may be somewhat limited, however, said Curt Monash, head of database research firm Monash Research.

Not many organizations today would need sub-second response times against a body of material as large and complex as Google's, Monash said in an email. Also, MapReduce is not the most efficient way of handling relational queries. That's what's led to a number of SQL-on-Hadoop technologies, such as Hive, Impala and Shark.

Also, typical enterprises should look for commercial or open-source options to keep their data warehouses consistent across data centers before adopting what Google's developed, Monash said. Most new data stores being developed today have some form of multi-version concurrency control (MVCC), he said.
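MVCC, the technique Monash mentions, keeps multiple timestamped versions of each record so readers see a consistent snapshot while writers create new versions, rather than blocking each other with locks. A minimal illustrative sketch in Python (the names here are invented for illustration):

```python
import itertools


class MVCCStore:
    """Minimal multi-version concurrency control sketch (illustrative only)."""

    def __init__(self):
        self.versions = {}                 # key -> list of (txn_id, value)
        self.next_txn = itertools.count(1) # monotonically increasing txn ids

    def write(self, key, value):
        """Each write appends a new version instead of overwriting in place."""
        txn = next(self.next_txn)
        self.versions.setdefault(key, []).append((txn, value))
        return txn

    def read(self, key, snapshot):
        """Return the newest version visible at the given snapshot id."""
        visible = [v for t, v in self.versions.get(key, []) if t <= snapshot]
        return visible[-1] if visible else None
```

A reader holding an older snapshot id keeps seeing the value that existed when its snapshot was taken, even after later writes land, which is the consistency behavior Monash says most new data stores now provide.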

Joab Jackson covers enterprise software and general technology breaking news for The IDG News Service. Follow Joab on Twitter at @Joab_Jackson. Joab's e-mail address is Joab_Jackson@idg.com

