Industrial IoT Is a Data Problem We Can Solve - With Machine Learning
by Simon Crosby, on Jul 24, 2017 11:02:59 AM
In my previous post I made the case that Industrial IoT (IIoT) is not an app or DevOps problem; it's a data problem. There aren't enough app and ops folks who know the language of big data and cloud to get the job done. And there's no way to deliver these new stacks in organizations with today's Information Technology (IT) and Operational Technology (OT) skill sets.
In this piece I want to make the case for an exciting solution to the challenge of delivering value in IIOT scenarios:
- IIOT is a data problem that we can solve - with machine learning.
The “app first” approach has its roots in the “big data and public cloud IaaS solve all problems” narrative. Given that narrative’s remarkable successes, who wouldn’t assume this is the obvious way to go? But for all the reasons I offered in my previous post, today at least, this approach is failing, and I’m not optimistic that this will change. There is, however, a fascinating alternative that we at Swim have proven time and again in our IIoT, Smart Cities, and Smart Grid projects: instead of slaving away at new apps, we can deliver immediate value to our customers, with almost no domain knowledge, by simply learning on their data on the fly, in context, without the need for complex new cloud stacks and DevOps skills. The story begins with a simple observation: Moore’s Law has served the edge just as well as it has the cloud. Learning at the edge on cheap devices, close to where the data is produced, has many benefits:
- We can learn on brownfield data to shortcut app building, without requiring our customers to rip and replace systems that are already working. A Raspberry Pi can easily learn on dirty data to produce powerful insights of immediate value.
- Using cheap devices, or systems you already own, we can learn locally to deliver immediate insights and permit a real-time response.
- The cost savings are immense. We can learn powerful insights on commodity devices and massively cut bandwidth, storage, and cloud processing costs. By way of example, in one of our Smart Cities deployments, a $6,000-per-month event processing and learning workload in AWS is transformed into a local learning problem in which we learn on the fly on streamed dirty data, on a device that costs under $500.
- Learning can substantially simplify the adoption of new solutions, overcoming the IT/OT skill-set problem. To do this, our solutions learn about their deployment context. Here’s an example: a Swim asset-tracking solution for RFID-based manufacturing starts cold, automatically learns the layout of the assembly floor from the data, and then deduces how components are aggregated into products, so that it can track product delivery instead of nuts and bolts. It does this just by watching and learning.
- Swim also applies learning to itself, introspecting to ensure security and availability. Once deployed, the product begins to understand its own behavior; if that behavior ever changes significantly, it can flag an error and even automatically revert to a known secure state.
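To make the “watching and learning” idea concrete, here is a minimal sketch of one way a system could deduce product groupings from raw RFID reads alone: count how often pairs of tags are seen together, then merge tags whose co-occurrence clears a threshold. The event format, tag names, and threshold are illustrative assumptions, not Swim’s actual algorithm.

```python
# Hypothetical sketch: infer which RFID tags belong to the same product
# by counting how often pairs of tags are read together.
from collections import Counter
from itertools import combinations

def group_components(read_events, min_cooccurrences=3):
    """read_events: list of (station, set_of_tag_ids_seen_together).
    Returns sets of tags that repeatedly co-occur (likely one product)."""
    pair_counts = Counter()
    for _station, tags in read_events:
        for a, b in combinations(sorted(tags), 2):
            pair_counts[(a, b)] += 1

    # Union-find over tags whose pair count clears the threshold
    parent = {}
    def find(t):
        parent.setdefault(t, t)
        while parent[t] != t:
            parent[t] = parent[parent[t]]  # path compression
            t = parent[t]
        return t
    for (a, b), n in pair_counts.items():
        if n >= min_cooccurrences:
            parent[find(a)] = find(b)

    groups = {}
    for t in parent:
        groups.setdefault(find(t), set()).add(t)
    return list(groups.values())

events = [
    ("station-1", {"bolt-7", "frame-2"}),
    ("station-2", {"bolt-7", "frame-2"}),
    ("station-3", {"bolt-7", "frame-2", "panel-9"}),
    ("station-4", {"bolt-7", "frame-2"}),
]
print(group_components(events))  # bolt-7 and frame-2 group together
```

Note that nothing here was told what a “product” is; the grouping emerges from the data, which is exactly the point of learning in context.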
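And as a flavor of how a deployed system might learn its own “normal” behavior on the fly, here is a minimal sketch using Welford’s online algorithm to track the running mean and variance of a metric in constant memory and flag readings that deviate sharply. The class name, threshold, and sample data are assumptions for illustration; Swim’s actual introspection is certainly richer.

```python
# Hypothetical sketch: online anomaly flagging on a streaming metric,
# using Welford's algorithm so a cheap edge device needs no stored history.

class StreamLearner:
    """Tracks running mean/variance of a stream in O(1) memory."""

    def __init__(self, z_threshold=4.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations
        self.z_threshold = z_threshold

    def observe(self, x):
        """Update statistics with one reading; return True if anomalous."""
        if self.n >= 2:
            std = (self.m2 / (self.n - 1)) ** 0.5
            anomalous = std > 0 and abs(x - self.mean) / std > self.z_threshold
        else:
            anomalous = False  # too little data to judge
        # Welford update
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

learner = StreamLearner()
readings = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 50.0]  # last reading is a spike
flags = [learner.observe(r) for r in readings]
print(flags[-1])  # the spike is flagged
```

The same loop could just as easily watch the system’s own request rates or error counts, which is the sense in which a product can “understand its own behavior” and notice when that behavior changes.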
I’m sure this post has left you with more questions than answers; admittedly, that is its goal. However, I would welcome the opportunity to dive deep into the “how” with your team if you are considering how local learning could be added to your environment to quickly deliver value to your stakeholders.