Microsoft to offer three new ways to store big data on Azure

Azure to feature a data warehouse, a 'data lake' and the ability to pool multiple databases

[Image: Microsoft Azure's new Data Lake architecture]

Microsoft will soon offer three additional ways for enterprises to store data on Azure, making the cloud computing platform better suited to big data analysis.

Azure will have a data warehouse service, a "data lake" service for storing very large amounts of data, and an option for running "elastic" databases that can handle data sets of widely varying size, said Scott Guthrie, Microsoft executive vice president of the cloud and enterprise group. Guthrie unveiled the new services at the company's Build 2015 developer conference, held this week in San Francisco.

The Azure SQL Data Warehouse, available later this year, will give organizations a way to store petabytes of data so it can be easily ingested by data analysis software, such as the company's Power BI tool for data visualization, the Azure Data Factory for data orchestration, or the Azure Machine Learning service.

Unlike traditional in-house data warehouse systems, this cloud service can quickly be adjusted to fit the amount of data that actually needs to be stored, Guthrie said. Users can also specify the exact amount of processing power they'll need to analyze the data. The service builds on the massively parallel processing architecture that Microsoft developed for its SQL Server database.
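If the provisioning model works the way Guthrie described, resizing compute should amount to a one-line administrative command. Below is a minimal sketch in Python, assuming a pyodbc connection and a T-SQL SERVICE_OBJECTIVE setting in the style Azure SQL Database already exposes; the server name, credentials, driver name, and the "DW400" tier label are all illustrative, not details Microsoft has published for the new service.

```python
# Minimal sketch: resizing compute for a cloud data warehouse from Python.
# Assumes a pyodbc ODBC connection and a SERVICE_OBJECTIVE-style T-SQL
# setting; the tier name 'DW400' and connection details are hypothetical.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 13 for SQL Server};"      # placeholder driver name
    "SERVER=myserver.database.windows.net;"        # hypothetical server
    "DATABASE=master;UID=admin_user;PWD=secret"    # placeholder credentials
)

def scale_warehouse(database: str, service_objective: str) -> None:
    """Request a new compute size without touching the stored data."""
    # ALTER DATABASE cannot run inside a transaction, hence autocommit.
    with pyodbc.connect(CONN_STR, autocommit=True) as conn:
        conn.execute(
            f"ALTER DATABASE [{database}] "
            f"MODIFY (SERVICE_OBJECTIVE = '{service_objective}')"
        )

# Scale up before a heavy analysis run, then back down afterwards.
scale_warehouse("SalesDW", "DW400")
```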

The Azure Data Lake is designed for organizations that need to store very large amounts of data so it can be processed by Hadoop and other "big data" analysis platforms. The service could be most useful for Internet of Things systems that amass large amounts of sensor data.

"It allows you to store literally an infinite amount of data, and it allows you to keep data in its original form," Guthrie said. The Data Lake uses Hadoop Distributed File System (HDFS), so it can be deployed by Hadoop or other big data analysis systems.

A preview of the Azure Data Lake will be available later this year.

In addition to these two new products, the company has also updated its Azure SQL Database service so customers can pool their Azure cloud databases to reduce storage costs and prepare for bursts of database activity.

"It allows you to manage lots of databases at lower cost," Guthrie said. "You can maintain completely isolated databases, but allows you to aggregate all of the resources necessary to run those databases."

The new service should be particularly useful for running public-facing software services, where the amount of database storage needed can fluctuate greatly. Today, most software-as-a-service (SaaS) providers must overprovision their databases to accommodate potential peak demand, which can be financially wasteful. The elastic option lets an organization pool the storage available to all of its databases, so that if one database grows rapidly, it can draw on unused space from the others.
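The saving comes from sizing capacity for the pool's combined peak rather than for the sum of each database's individual peak. A small worked example with made-up numbers:

```python
# Worked example (made-up numbers): pooling lets capacity follow the
# combined peak instead of the sum of individual peaks.

# Hourly demand (arbitrary capacity units) for three databases whose
# bursts do not coincide.
demand = {
    "db_a": [5, 50, 5, 5],
    "db_b": [5, 5, 40, 5],
    "db_c": [5, 5, 5, 30],
}

# Isolated provisioning: each database is sized for its own peak.
isolated_capacity = sum(max(series) for series in demand.values())   # 120

# Pooled provisioning: the pool is sized for the largest combined
# load at any single moment.
pooled_capacity = max(sum(hour) for hour in zip(*demand.values()))   # 60

print(f"isolated: {isolated_capacity}, pooled: {pooled_capacity}")
```

With bursts that never coincide, the pooled figure here is half the isolated one; the gap grows as more databases with staggered peaks share the pool.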

The new elastic pooling feature is now available in preview mode.

Joab Jackson covers enterprise software and general technology breaking news for The IDG News Service. Follow Joab on Twitter at @Joab_Jackson. Joab's e-mail address is Joab_Jackson@idg.com
