Azure’s new machine learning features embrace Python


Microsoft has unveiled several new additions to its Azure ML offering for machine learning, including better integration with Python and automated self-tuning features for faster model development.

Python is a staple language for machine learning, thanks to its low barrier to entry and its wide range of machine learning libraries and support tools.

Central to Azure’s Python support is a new SDK that connects Azure ML to a developer’s existing Python environment.

The SDK is delivered as the azureml-sdk package, which can be installed using Python’s pip package manager. Most Python environments can connect to Azure ML this way, from a generic Python install to a data-science distribution like Anaconda or a Jupyter notebook session.
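As a rough sketch, assuming the azureml-sdk package has been installed with pip (pip install azureml-sdk), connecting a local Python session to an Azure ML workspace looks something like this; the workspace name, subscription ID, and resource group below are placeholders:

from azureml.core import Workspace

# Attach to an existing Azure ML workspace (all names here are placeholders).
ws = Workspace.get(name="my-workspace",
                   subscription_id="<subscription-id>",
                   resource_group="my-resource-group")

# Alternatively, load connection details from a downloaded config.json file:
# ws = Workspace.from_config()

print(ws.name, ws.location)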

Tools offered through the SDK include data preparation, logging the results of experiment runs, saving and retrieving experiment data from Azure blob storage, automatically distributing model training across multiple nodes, and ways to automatically create various execution environments for jobs, such as remote VMs, Docker containers, and Anaconda environments.
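For instance, the experiment-tracking side of the SDK looks roughly like the following sketch; the experiment name, metric, and output file are hypothetical, and ws is the workspace object connected above:

from azureml.core import Experiment

# Create (or reuse) an experiment in the connected workspace.
experiment = Experiment(workspace=ws, name="demo-experiment")

# Start an interactive run, log a metric, and attach an output file,
# which is stored in the workspace's Azure blob storage.
run = experiment.start_logging()
run.log("accuracy", 0.91)  # illustrative metric value
run.upload_file("outputs/model.pkl", "model.pkl")  # assumes a local model.pkl exists
run.complete()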

Another new Azure ML feature supported by the new Python SDK is automated machine learning.

The underlying concept isn’t new: it’s a form of hyper-parameter optimisation, a way of automatically tuning the parameters used to train a particular machine learning model so that it yields better results.

Microsoft describes it as “a recommender system for machine learning pipelines. Similar to how streaming services recommend movies for users, automated machine learning recommends machine learning pipelines for data sets.”
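In the SDK this surfaces, roughly, as an AutoMLConfig object submitted to an experiment. The sketch below is illustrative: the task type, metric, iteration count, and label column are placeholder choices, and training_dataset is assumed to be a tabular dataset already loaded into the workspace:

from azureml.core import Experiment
from azureml.train.automl import AutoMLConfig

# Describe what automated ML should optimise for (values are illustrative).
automl_config = AutoMLConfig(task="classification",
                             primary_metric="accuracy",
                             training_data=training_dataset,  # placeholder dataset
                             label_column_name="label",
                             iterations=20)

# Submit the configuration; Azure ML trials candidate pipelines and ranks them.
automl_run = Experiment(workspace=ws, name="automl-demo").submit(automl_config)
automl_run.wait_for_completion()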

Redmond also claims the automation can be done without looking directly at sensitive data, thus preserving users’ privacy.

Other new features include distributed deep learning, which allows models to be trained automatically across a cluster of machines without having to configure the cluster, and hardware-accelerated inferencing, which uses FPGAs to speed up the serving of inferences from models.
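On the cluster side, the SDK can provision a managed, auto-scaling compute target without any manual cluster setup; in the sketch below the cluster name, VM size, and node counts are assumptions:

from azureml.core.compute import AmlCompute, ComputeTarget

# Provision a managed, auto-scaling training cluster (settings are illustrative).
cluster_config = AmlCompute.provisioning_configuration(vm_size="STANDARD_NC6",
                                                       min_nodes=0,
                                                       max_nodes=4)
gpu_cluster = ComputeTarget.create(ws, "gpu-cluster", cluster_config)
gpu_cluster.wait_for_completion(show_output=True)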

Also available is model management via CI/CD, with Docker containers used to package and manage trained models.
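A minimal sketch of that model-management step, assuming a trained model saved locally as model.pkl, is to register it with the workspace so it can be versioned and later packaged into a Docker image for deployment:

from azureml.core.model import Model

# Register a trained model file with the workspace (file and model names are placeholders).
model = Model.register(workspace=ws,
                       model_path="model.pkl",
                       model_name="demo-model",
                       description="illustrative model registration")

print(model.name, model.version)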

