As Watson matures, IBM plans more AI hardware and software

IBM releases faster hardware and software that make it easier to train deep-learning systems

Just over five years ago, IBM's Watson supercomputer crushed its opponents on the televised quiz show Jeopardy. It was hard to foresee at the time, but artificial intelligence now permeates our daily lives.

Since then, IBM has expanded the Watson brand into a cognitive computing package of hardware and software used to diagnose diseases, explore for oil and gas, run scientific computing models, and help cars drive autonomously. The company has now announced new AI hardware and software packages.

The original Watson used advanced algorithms and natural language interfaces to find and narrate answers. At the time, Watson was a single supercomputer; today, AI systems are deployed at a far grander scale. Mega data centers run by Facebook, Google, Amazon, and other companies use AI across thousands of servers to recognize images and speech and to analyze vast amounts of data.

Watson is just one of IBM's efforts; the company has further initiatives to bring AI to other businesses. It is releasing more powerful hardware to make deep-learning systems faster at analyzing data and answering complex questions, and it is pairing those superfast systems with new software tools.

The new IBM hardware and the accompanying software tools, called PowerAI, are used to train systems to perform AI tasks such as image and speech recognition. The more a system learns from data, the more accurate its results, and that training requires a lot of computing horsepower. The new training hardware is available now.
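To make the idea of "training" concrete, the toy example below fits a tiny classifier with gradient descent in plain NumPy. It is only an illustrative sketch of the kind of iterative, compute-hungry loop that frameworks packaged with PowerAI run at far larger scale on GPUs; it is not IBM's actual tooling.

```python
import numpy as np

# Toy training loop: logistic regression on synthetic data.
# Real deep-learning training repeats this kind of forward/backward
# pass over millions of examples, which is why GPU horsepower matters.
rng = np.random.RandomState(0)
X = rng.randn(1000, 20)                       # 1,000 samples, 20 features
true_w = rng.randn(20)
y = (X @ true_w + 0.1 * rng.randn(1000) > 0).astype(float)  # labels

w = np.zeros(20)
b = 0.0
lr = 0.1

for epoch in range(100):
    # Forward pass: predicted probabilities
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    # Backward pass: gradient of the cross-entropy loss
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    # Parameter update
    w -= lr * grad_w
    b -= lr * grad_b

# Evaluate and save the trained weights for later inference
p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
print("training accuracy: %.3f" % np.mean((p > 0.5) == y))
np.savez("toy_model.npz", w=w, b=b)
```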

Ultimately, the hardware and software could be the key cog in making Watson technologies easily available to companies through the cloud or on premises. For now, IBM is not saying whether they will become part of Watson.

The first piece of hardware is a Power8 server with Nvidia Tesla GPUs, said Sumit Gupta, IBM's vice president of high-performance computing and analytics.

The hardware is the fastest deep-learning system available, Gupta said. The Power8 CPUs and Tesla P100 GPUs are among the fastest chips available, and both are linked via the NVLink interconnect, which outperforms PCI-Express 3.0. Nvidia's GPUs power many deep-learning systems in companies like Google, Facebook, and Baidu.

"Performance is very important as deep learning training jobs run for days," Gupta said. It's also important to speed up key technologies like storage and networking, he said.

The Power8 hardware is available via the Nimbix cloud, which provides bare-metal access to the hardware and an InfiniBand backend.

IBM is also planning hardware and software for inferencing, which requires lighter processing on the edge or end device. An inferencing engine takes a trained model, applies it to new data or input, and delivers results such as classifications or predictions. Drones, robots, and autonomous cars use inferencing engines for navigation, image recognition, and data analysis.
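As a rough illustration of the difference, inference reuses the weights that training already produced: the sketch below loads the toy model saved in the earlier training example and applies it to new input. This is only a conceptual example; real inferencing engines on drones or cars run optimized versions of far larger models.

```python
import numpy as np

# Inference: no gradient computation, just a forward pass on new data,
# which is why it needs far less horsepower than training.
model = np.load("toy_model.npz")          # weights saved by the training sketch
w, b = model["w"], model["b"]

new_samples = np.random.randn(5, 20)      # e.g. fresh sensor readings
probs = 1.0 / (1.0 + np.exp(-(new_samples @ w + b)))
predictions = (probs > 0.5).astype(int)

print("predicted classes:", predictions)
```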

Inferencing chips are also used in data centers to speed up deep-learning models. Google has created its own chip, called the TPU (Tensor Processing Unit), and other companies like KnuEdge, Wave Computing, and GraphCore are developing inferencing chips of their own.

IBM is working on a different model for its inferencing hardware and software, Gupta said. He did not provide any further details.

The PowerAI software is the glue that ties IBM's AI hardware and tools into a cohesive package. IBM has forked a version of the open-source Caffe deep-learning framework to run on its Power hardware, and it also supports other frameworks and libraries such as TensorFlow, Theano, and OpenBLAS.

The frameworks are sandboxes in which users can create computer models that learn to solve a particular problem and tweak their parameters. Caffe is widely used for image recognition.
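For a sense of what working inside such a framework looks like, here is a minimal model definition and training step written against the TensorFlow 1.x graph-style API, one of the frameworks the article says PowerAI supports. The layer sizes, learning rate, and random data are arbitrary illustrative choices, not anything IBM ships; in Caffe, an equivalent model would typically be declared in a .prototxt file instead.

```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x graph-style API assumed

# A tiny fully connected classifier defined as a graph, then trained.
x = tf.placeholder(tf.float32, [None, 20])
labels = tf.placeholder(tf.float32, [None, 2])

hidden = tf.layers.dense(x, 64, activation=tf.nn.relu)
logits = tf.layers.dense(hidden, 2)

loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits))
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    data = np.random.randn(128, 20).astype(np.float32)
    onehot = np.eye(2)[np.random.randint(0, 2, 128)].astype(np.float32)
    for step in range(100):
        _, l = sess.run([train_op, loss],
                        feed_dict={x: data, labels: onehot})
    print("final loss: %.4f" % l)
```

Tweaking the "parameters" the article mentions amounts to changing values like the layer width (64) or the learning rate (0.01) and rerunning the training loop.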

