From M2M to IoT: Old industries have to learn new tricks

IoT offers advantages over older connected machines, but updating may not be easy

IoT systems from Kore are used in industrial settings like this.

The Internet of Things may be a new idea, but machines talking to other machines is not.

The way connected devices are evolving in industry, where they've been used for years, says a lot about why IoT has the potential to be something very big. But within enterprises, that evolution can also present some challenges.

It's a common but sometimes overlooked part of the shiny new world of IoT. Most opportunities to deploy enterprise IoT are in so-called brownfield environments, organizations where some form of connected device is already in place. Only 2 percent of those installed systems have been upgraded to IoT so far, Cisco Systems estimates.

Manufacturers, utilities, oil companies and other enterprises, typically ones with many widely dispersed physical assets, have used connected devices to monitor their operations and remotely control infrastructure for 20 years or more.

"Anything with a lot of machines has had what I'll just generically call 'connected machines' for a long time," Gartner analyst Hung LeHong said.

Factory motors have thermometers that continually report readings to management software. Telemetry systems track vehicle location and performance. SCADA (supervisory control and data acquisition) systems have been used for years in oil refineries, power distribution networks, water treatment plants and other large facilities and networks. All can make enterprises more efficient and prevent losses and unexpected failures, which can be both expensive and dangerous.

Those earlier technologies, sometimes called M2M (machine-to-machine), are typically linked only to local or private networks. About 80 percent of the connected machines in place today aren't on the Internet, according to Ido Sarig, vice president and general manager of IoT Solutions at Wind River, Intel's embedded software subsidiary.

M2M systems typically are reliable, resilient, and designed to keep working for many years. But most have been purpose-built for one job in one setting, relying on specific hardware, software and networks. This vertical integration can lock customers into one vendor that, in turn, is limited to developing and producing products for a relatively small market, said Bill Bien, a partner at Waterstone Management Group, a consulting firm that has advised many enterprises on connected device strategies.

"Now, with the Internet of Things, you're disrupting that hard relationship," he said.

IoT is built around IP (Internet Protocol) and horizontal layers of hardware, software and connectivity where technologies from many different vendors can come into play. That opens up new possibilities even as it drives costs down.

Smaller and cheaper hardware, made possible by advances in chip design, is helping to drive the change. Cost often dictates use, and in enterprise M2M, high hardware costs have meant selective deployments.

For example, an automaker that Waterstone advised could only afford to put sensors in the most critical and expensive parts in a factory, such as boilers, motors and pumps. It wired up those systems to detect impending breakdowns because such a failure would be most disruptive to the plant, said Hubert Selvanathan, a principal at Waterstone.

But the automaker also relied heavily on ball bearings used in the miles of conveyor belts around the factory, which could fail due to wear or overheating and bring the whole assembly line to a halt. When smaller, less expensive sensors became available, those ball bearings could be instrumented to regularly report their condition so the company knew when to replace them.
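Condition monitoring of this kind usually amounts to comparing each sensor report against failure thresholds. The sketch below illustrates the idea; the thresholds, field names, and readings are invented for illustration and are not from the automaker's actual system.

```python
# Illustrative condition-based monitoring for conveyor bearings:
# each sensor reports temperature and vibration, and a bearing is
# flagged for replacement when either reading crosses a threshold.
# All thresholds and field names are hypothetical.

TEMP_LIMIT_C = 90.0          # illustrative overheating threshold
VIBRATION_LIMIT_MM_S = 7.1   # illustrative wear threshold (mm/s RMS)

def needs_replacement(reading):
    """Return True if a bearing reading indicates impending failure."""
    return (reading["temp_c"] >= TEMP_LIMIT_C
            or reading["vibration_mm_s"] >= VIBRATION_LIMIT_MM_S)

readings = [
    {"bearing_id": "B-101", "temp_c": 62.0, "vibration_mm_s": 2.3},
    {"bearing_id": "B-102", "temp_c": 95.5, "vibration_mm_s": 3.0},
    {"bearing_id": "B-103", "temp_c": 70.1, "vibration_mm_s": 8.4},
]

flagged = [r["bearing_id"] for r in readings if needs_replacement(r)]
print(flagged)  # B-102 (overheating) and B-103 (excess vibration)
```

The point of cheaper sensors is that this simple check can now run against thousands of bearings instead of a handful of boilers and pumps.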

Once data is collected, companies now have more flexible tools to analyze it and combine it with other sources. Where companies previously had to make the data from devices match a proprietary application's format, now they can apply open APIs (application programming interfaces) to take advantage of more generalized platforms.

Fragmented, industry-specific standards have given way to more open technologies that are widely used, such as the MQTT (Message Queuing Telemetry Transport) protocol for ingesting IoT data, and SAP Hana and Hadoop clusters for analytics. Hana, for example, gives users a way to bring IoT data into a company's ERP (enterprise resource planning) platform. Physical gateways, such as those made by Intel and Cisco, may translate between the legacy systems and newer protocols.
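MQTT itself just carries opaque payloads on hierarchical, slash-separated topics; what gives the data meaning is a topic scheme and payload format the organization agrees on. The sketch below shows one common convention, JSON payloads on site/device/metric topics; the scheme and field names are illustrative, not part of the MQTT specification.

```python
import json

# Minimal sketch of shaping telemetry for publication to an MQTT
# broker. The topic hierarchy (site/device/metric) and the JSON
# payload fields are illustrative conventions, not MQTT requirements.

def make_message(site, device, metric, value, unit):
    """Build an MQTT-style topic string and a JSON payload."""
    topic = f"{site}/{device}/{metric}"   # e.g. plant1/motor3/temp
    payload = json.dumps({"value": value, "unit": unit})
    return topic, payload

topic, payload = make_message("plant1", "motor3", "temp", 71.4, "C")
print(topic)    # plant1/motor3/temp
print(payload)
```

In a real deployment the pair would be handed to an MQTT client library for publishing, and subscribers downstream could filter with topic wildcards such as `plant1/+/temp`.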

Cloud-based analytics can open up new possibilities for connected machines. For example, Daikin Applied, a maker of HVAC (heating, ventilation and air conditioning) systems, equips its big rooftop units with numerous sensors, according to Sarig from Wind River. In the past, technicians periodically went up on the roof and used thumb drives to collect historical data stored in the HVAC unit.

Using a gateway developed by Intel, Daikin linked those sensors to the Internet. Now the HVAC systems can continuously send sensor data to the cloud, where it is combined with weather forecasts and information about demand-based local electricity rates. A Daikin-developed algorithm can analyze those streams of data and tell the system when to cool off a building ahead of time so peak rates can be avoided, Sarig said.
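The kind of decision that algorithm makes can be sketched as a simple rule: if a hot day is forecast and peak rates are approaching, start cooling now while electricity is cheap. Everything below, the thresholds, parameters, and logic, is a hypothetical illustration, not Daikin's actual algorithm.

```python
# Hypothetical pre-cooling decision combining indoor sensor data,
# a weather forecast, and the utility's peak-rate schedule. All
# thresholds and parameter names are illustrative guesses.

def should_precool(indoor_temp_c, forecast_high_c, hours_to_peak_rate):
    """Decide whether to start cooling before peak rates begin."""
    hot_day = forecast_high_c >= 30.0        # illustrative threshold
    peak_soon = 0 < hours_to_peak_rate <= 2  # cool just before peak
    warm_inside = indoor_temp_c >= 23.0      # worth cooling at all
    return hot_day and peak_soon and warm_inside

print(should_precool(24.0, 33.0, 1.5))  # True: pre-cool at off-peak rates
print(should_precool(21.0, 33.0, 1.5))  # False: already cool enough
```

The value of the cloud here is that the decision draws on data sources, forecasts and rate schedules, that an isolated rooftop unit could never see.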

Broader and faster networks have also expanded the possibilities. For example, in the past, a company that monitors oil and gas infrastructure relied on pipeline sensors that could only use the signalling channels of cellular networks, according to Kore Wireless Group, a longtime M2M services company. That channel, which is also used for SMS (Short Message Service), could only carry a few bytes of data at a time, so the system was limited to simple "on" and "off" messages.

As cellular networks evolved to 2G and 3G data services, the monitoring company installed smarter sensors that could detect and report how much oil or gas was leaking from the pipeline, Kore CEO Alex Brisbourne said. With 4G, it could be possible to remotely switch on a video camera and view the damage, helping an operator decide what kind of crew to send in response, he said.

But as promising as these new technologies are, moving on from older ones can be hard. IoT has to match the capabilities of legacy M2M to gain users' confidence, and the change could even shake up organizations.

Purpose-built systems on private networks have security and reliability characteristics that the Internet and commercial networks don't inherently offer, Gartner's LeHong said. If designed right, a pipeline valve connected to a proprietary wired network will be safe from intruders and will close, reliably and immediately, when the command is sent from its proprietary control system.

Legacy industrial systems have had some security shortcomings of their own, as shown by the Stuxnet worm that attacked SCADA systems and the Target point-of-sale breach, in which attackers got in through credentials stolen from an HVAC contractor. But Internet-based technologies may need some work to match all the qualities of the older systems, LeHong said.

"Everything has to work as if it were a closed, private environment, or better. ... We can't just put mission-critical use cases on the Internet" without additional steps to harden the systems, he said.

Industrial vendors such as General Electric and IT companies such as Cisco and IBM are working to bridge the gap in security and reliability, with organizations such as the Industrial Internet Consortium seeking to align some of those efforts. Cisco and other vendors have started to introduce application-enablement platforms.

But along with the technological leap, most enterprises will have to address organizational issues, Cisco and others say.

Companies with a lot of systems for making, processing or transporting things have long managed those separately from IT systems that handle data. Operations people chose the M2M platforms to run their factories or oil rigs, managed the systems themselves and consumed the data they gathered through specialized consoles.

Now that physical operations can benefit from assets that came from the other side of that divide, such as IP networks, mass-produced chips and cloud computing, operations staff and IT have to learn a complicated dance together.

"They kind of don't trust each other," said Maciej Kranz, vice president and general manager of Cisco's corporate technology group. Both sides may be resistant to change, he said. Cisco's educational division recently moved to help bridge that gap by offering a specialization in industrial networking.

Enterprises that want to move from one generation of connected things to the next probably won't do it in one leap. Legacy M2M systems are often too widely dispersed and too deeply embedded to replace economically, which is why they're built to be used for many years.

"You're not going to change your gas turbine engine just to get more sensors," LeHong at Gartner said. But it might be possible to retrofit that engine with more sensors, at a lower cost, if the additional data would make a factory run more efficiently, with more preventive maintenance and fewer unplanned outages. Those are the calculations that companies have to make.

Other legacy systems might be even harder to update. But new analytics software can do more with the data those sources have been supplying for years, Selvanathan at Waterstone said. Even if the data resides in a closed silo, such as a repository kept by plant historian software, companies may be able to crunch the numbers again using a platform like Hana.

"The good news is, yes, there's a way to get the data out. The not-so-great news is that it's non-trivial to do that," Selvanathan said. Data extraction, migration and loading takes new infrastructure and system-integration skills that most organizations would have to hire in from outside, he said -- but that's likely to cost a lot less than scrapping what's already there.
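The extract step Selvanathan describes can be pictured as parsing a historian's export into plain records that a generic analytics platform can ingest. The sketch below assumes a simple CSV dump with tag/timestamp/value columns; real historians have proprietary export formats and APIs, which is what makes the job non-trivial.

```python
import csv
import io

# Illustrative extract step: pull time-series records out of a
# plant-historian CSV export into plain dictionaries. The column
# names, tags, and CSV format are assumptions for illustration.

historian_export = """tag,timestamp,value
boiler1.temp,2014-06-01T00:00:00,181.2
boiler1.temp,2014-06-01T00:05:00,183.7
pump2.flow,2014-06-01T00:00:00,42.0
"""

def extract_records(csv_text):
    """Parse a historian CSV dump into typed records."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [{"tag": r["tag"],
             "timestamp": r["timestamp"],
             "value": float(r["value"])} for r in rows]

records = extract_records(historian_export)
print(len(records))          # 3
print(records[0]["value"])   # 181.2
```

Once the data is in a neutral shape like this, loading it into a platform such as Hana or a Hadoop cluster is a standard integration task rather than a vendor-specific one.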

Stephen Lawson covers mobile, storage and networking technologies for The IDG News Service. Follow Stephen on Twitter at @sdlawsonmedia.
