The cloud is a good fit for the Internet of Things (IoT) because it provides both the scale and the connectivity needed to handle large volumes of sensor readings from large numbers of devices in different locations. Through its platform services, the cloud also offers the tools for processing that information.

But the cloud can also be inconvenient for IoT when the devices being monitored and measured are in remote locations with little connectivity; predictive analytics in the cloud, triggered by anomalous sensor readings, isn’t much use if the latency is so high that whatever the system is trying to predict will already have happened before the data and predictions can make the round trip.

Microsoft’s Azure IoT Edge service (currently in preview) uses Docker containers to package up key parts of the Azure IoT Hub cloud service so they can run on anything from a local server cluster to a tiny board like a Raspberry Pi acting as an IoT gateway. It works with Azure Functions, so serverless code can trigger actions locally, and it takes functionality from Azure Machine Learning and Stream Analytics so machine learning can analyze a stream of sensor data and send alerts without a round trip to the cloud.
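
As a rough illustration of what such a locally triggered module could look like, here is a minimal sketch assuming the azure-iot-device Python SDK; the input name, output name and threshold are illustrative placeholders rather than anything prescribed by Azure IoT Edge.

```python
# Minimal sketch of a custom Azure IoT Edge module that reacts to telemetry
# locally, assuming the azure-iot-device Python SDK. The names "sensorInput",
# "alertOutput" and the threshold are illustrative assumptions.
from azure.iot.device import IoTHubModuleClient, Message

ALERT_THRESHOLD = 85.0  # hypothetical trigger condition

def main():
    # The IoT Edge runtime injects the connection settings into the container.
    client = IoTHubModuleClient.create_from_edge_environment()
    client.connect()

    def handle_message(message):
        if message.input_name != "sensorInput":
            return
        reading = float(message.data.decode("utf-8"))
        if reading > ALERT_THRESHOLD:
            # Send an alert on a named output; a route in the deployment
            # manifest decides whether it goes to another module or the cloud.
            client.send_message_to_output(Message(f"alert:{reading}"), "alertOutput")

    client.on_message_received = handle_message
    input("Module running; press Enter to exit.\n")
    client.shutdown()

if __name__ == "__main__":
    main()
```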

In the future, that will also include Cognitive Services, Microsoft’s pre-built and customizable deep learning APIs for tasks like vision, text and speech recognition, Azure IoT Director Sam George told InApps Technology. “IoT on the edge is already being used right now, but people are having to develop their own custom solutions. What you want to do is take that intelligence that’s available in the cloud and be able to run it on an IoT device directly. You’re able to take cloud intelligence, so you don’t have to custom develop intelligence for the edge; you can simply move it up to the devices.”

Microsoft has been adding model offload options to its Azure Machine Learning service, including for iOS 11. With Azure IoT Edge, the Azure monitoring portal will have options for exporting Docker containers from Azure ML, Functions and Stream Analytics and connecting them together to create sophisticated systems.

Schneider Electric is using Azure IoT Edge with the SCADA systems it sells to monitor remote oil wellheads, which run unattended. “Many of them are in remote locations where they only have a satellite uplink; they use that to send an SMS saying ‘the pump broke’ and someone drives out and fixes it,” George explained. With Azure IoT Edge, the company attaches an IoT device to the SCADA system that runs local machine learning to monitor the telemetry coming off it. When the data suggests the pump is going to break, the maintenance team can get someone out there to fix it before it does.

The Schneider Electric system for artificial lift pumps uses Azure Machine Learning and Azure IoT Edge to bring predictive analytics to the edge.
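
Schneider’s actual model and framework aren’t detailed here, but the shape of the local scoring step is simple enough to sketch; the example below assumes a pre-trained scikit-learn classifier baked into the container image, with the file path, feature order and threshold as illustrative placeholders.

```python
# Illustrative only: local predictive scoring of pump telemetry at the edge.
# Assumes a pre-trained scikit-learn classifier packaged into the container
# image; the model path, feature order and threshold are hypothetical.
import joblib

model = joblib.load("/app/pump_failure_model.pkl")  # hypothetical model file

def failure_probability(features):
    """Return the predicted probability that the pump fails soon.

    `features` is a vector already cleansed and ordered the way the model
    expects (e.g. pressure, temperature, vibration, flow rate).
    """
    return model.predict_proba([features])[0][1]

def should_dispatch_maintenance(features, threshold=0.8):
    # Schedule a visit only when the predicted failure risk crosses the
    # (illustrative) threshold, instead of waiting for an SMS that it broke.
    return failure_probability(features) > threshold
```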

“Containers are a natural way to allow not only portability but also reusability so that you can assemble just the right application,” George explained. “Azure IoT Edge apps are assembled from one or more containers and the containers do inter-container communications using the local message broker in IoT Edge.”

While one container runs an adapter to talk to the SCADA system, another container cleanses the data and transforms it into something the machine learning model expects. Azure IoT Edge doesn’t currently use the Azure Service Fabric orchestration system, “but in future, we will make Service Fabric options available for multi-node,” George told us.
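
A sketch of that middle, data-cleansing step might look like the following, again assuming the azure-iot-device Python SDK; the input and output names and the field mapping are illustrative, and the actual wiring between containers is declared as routes in the IoT Edge deployment manifest.

```python
# Sketch of a "transform" module sitting between the SCADA adapter container and
# the machine learning container, assuming the azure-iot-device Python SDK.
# Input/output names and field names are illustrative; broker routes defined in
# the deployment manifest connect the containers together.
import json
from azure.iot.device import IoTHubModuleClient, Message

client = IoTHubModuleClient.create_from_edge_environment()
client.connect()

def handle_message(message):
    if message.input_name != "scadaInput":
        return
    raw = json.loads(message.data)
    # Cleanse and reshape the adapter's output into the layout the ML model expects.
    features = {
        "pressure": float(raw.get("press", 0.0)),
        "temperature": float(raw.get("temp", 0.0)),
    }
    # Publish on a named output; a local broker route delivers it to the ML module.
    client.send_message_to_output(Message(json.dumps(features)), "cleanOutput")

client.on_message_received = handle_message
input("Transform module running; press Enter to stop.\n")
client.shutdown()
```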

Choose Where to Run

Container portability provides a lot of flexibility for deployment within the Windows/Azure environment. The same machine learning models that run on Azure and can be deployed to IoT Edge devices will also run on Azure Stack, George pointed out; they will also work with the Machine Learning Server built into SQL Server 2017 (which is also available separately as the latest version of what used to be called Microsoft R Server).

Using containers also makes IoT device updates more efficient, George explained. IoT Edge keeps a “device twin” for each device, so the latest information from a device is always available even when it’s offline (and developers can carry on coding and testing with devices that aren’t connected all the time). It also keeps a “module twin” that controls the configuration of each container deployed through the service, whether that’s a machine learning model or an Azure Function.

“When you take a machine learning model that has input variables you can adjust, typically in machine learning you’re adjusting those continually. With Azure IoT Edge, you can not only place that machine learning model on the edge but you can remotely control it with the module twin; if you want to change one of the input variables you can set that in the cloud on the module twin and we’ll replicate that change down. The machine learning model in the container will get notified and can adjust its input variable, and it can turn around and write back to the module twin so we can see that the container has acknowledged it and is now running in that state,” George said.

“It’s not just about placing workloads; it’s about managing and monitoring them when they’re there. You don’t want to send another half gigabyte model down just to adjust an input parameter.”
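
That round trip, with a desired-property change replicated down and acknowledged through reported properties, can be sketched roughly as follows, again assuming the azure-iot-device Python SDK; the property name scoring_threshold is an illustrative stand-in for whatever input variable a given model exposes.

```python
# Sketch of the module-twin round trip described above, assuming the
# azure-iot-device Python SDK. The property name "scoring_threshold" is an
# illustrative stand-in for a model input variable.
from azure.iot.device import IoTHubModuleClient

client = IoTHubModuleClient.create_from_edge_environment()
client.connect()

# Local state the scoring code would consult; updated from the module twin.
model_settings = {"scoring_threshold": 0.8}

def handle_desired_properties(patch):
    # Called when the module twin's desired properties change in the cloud.
    if "scoring_threshold" in patch:
        model_settings["scoring_threshold"] = float(patch["scoring_threshold"])
        # Write the accepted value back to the reported properties so the cloud
        # can see the container has acknowledged it and is running in that state.
        client.patch_twin_reported_properties(
            {"scoring_threshold": model_settings["scoring_threshold"]}
        )

client.on_twin_desired_properties_patch_received = handle_desired_properties
input("Module running; press Enter to stop.\n")
client.shutdown()
```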

Azure IoT Edge isn’t the only place the Azure IoT services use containers; the remote monitoring tools in Azure IoT Hub include some preconfigured solutions. Customers like how easy those are to get started with, George told us, but found customizing them hard, so they’ve been rewritten as microservices, in both .NET and Java versions. “The microservices run in a Kubernetes cluster and you can simply deploy updates to that, so it helps with agility. And if you want to replace part of the solution, you can just replace that microservice.”

Microsoft is a sponsor of InApps Technology.

Images via Microsoft.

InApps Technology is a wholly owned subsidiary of Insight Partners, an investor in the following companies mentioned in this article: Docker.