Often when I visit enterprise users of industrial automation – in manufacturing or retail, for example – I find tremendous excitement among ops teams at the potential for IoT solutions to deliver value. But when I dig deeper into their requirements, or ask how they would measure the value of an IoT solution, the room falls silent. Ops teams are too busy meeting delivery schedules and production targets to imagine the possibilities of an environment transformed by machine learning and real-time insight and control, so they struggle to see how new technology could transform their organization's productivity and make their lives easier.
Many organizations I meet face productivity challenges that stem from rapid growth or limited budgets. Ops staff don't have the luxury of time to imagine an easier job or a more productive environment – they are too busy keeping things running. And they are drowning in data. Data dominates the lives of many production teams because facilities, equipment and products are increasingly tagged electronically, or actively engaged in control and instrumentation. Ops folk are the data faithful: data can help them find problems, diagnose failures, and explain hiccups or downtime, and it can reveal opportunities for improvement. Ops folk know this to be true – they passionately believe it – and so they save all of it, or as much as they can. Whether the data is actually useful in practice is an afterthought. Having reams of data is good because "IF …"
Like ops teams whose hard-won experience convinces them of the value of data, I know that the value of data is enormous. But I want to challenge this post hoc notion of data's value, and in doing so find a way to escape the tedious, expensive and complex collection, storage and analysis challenges that IT (Information Technology) and OT (Operational Technology) teams face today. The data is not valuable per se – valuable insights are computed from it. And there is no reason why insights must be computed after the fact, or why ephemeral data should be collected and stored forever.
- If you can compute over real-time, time-series data on the fly, the insights derived from it are available 24x7 and can be used as input to other applications, or delivered to operators who use them to make on-the-ground decisions. Real-time insights are of huge value to operators seeking to maximize production, minimize downtime, or otherwise maximize the yield from their assets.
- And if insights are computed in-stream rather than post hoc, there is no need to store the ephemeral raw data at all: the valuable insight is what's needed, and it is continually available. Data management problems and storage infrastructure costs simply disappear. There is no need for a "big data" solution because there is no big data left to manage: transforming "data" into "insight" delivers infrastructure cost savings in bandwidth, compute and storage of many orders of magnitude. By way of example, when Swim learns from real-time traffic sensor data to predict the future state of an intersection, we readily achieve a reduction in data volume by a factor of 10,000 – hundreds of MB/s become tens of KB/s.
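To make the in-stream idea concrete, here is a minimal sketch – not Swim's actual API; the class and method names are invented for illustration – of computing a per-sensor insight incrementally, so each raw sample can be discarded the moment it has been folded into the summary:

```python
from collections import defaultdict

class StreamingInsight:
    """Maintain a constant-size running summary per sensor, so raw
    samples never need to be stored after they are processed."""

    def __init__(self):
        self.count = defaultdict(int)
        self.mean = defaultdict(float)

    def observe(self, sensor_id, value):
        # Incremental mean update: O(1) state per sensor,
        # regardless of how many samples the stream delivers.
        self.count[sensor_id] += 1
        self.mean[sensor_id] += (value - self.mean[sensor_id]) / self.count[sensor_id]

    def insight(self, sensor_id):
        """The continuously available insight for one sensor."""
        return self.mean[sensor_id]
```

The state kept per sensor is a handful of bytes, however many megabytes per second of raw readings flow through – which is exactly where the orders-of-magnitude reduction from raw data to insight comes from.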
So real-time, always-on insights are the key to delivering solution value and to minimizing the cost of the infrastructure needed to achieve it. But there's still a nagging problem: who writes the app that consumes the data to deliver the insights? This is where most "IoT solution vendors" whip out their professional services teams to build a custom application. That approach is problematic in many ways. At the very least it's costly, and the resulting "app" delivers only the outputs it was specified to deliver. Worse, it requires that someone define the app, understand its value, and commit money to build it before the insights have been proven valuable. So we have an imagination challenge, because ops teams don't know the value of different insights, and a budget challenge, because we have to build the app before we can get the insights.
A solution: at Swim we listen to data and build apps from it, on the fly. We just need data – the more we get, the better. We learn across the data to find useful insights and present them to users in real time, all the time. We let operators pick the insights of value to them, and we focus on learning in-stream and across streams to automatically pick out the faintest signals that could be profoundly valuable.
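As one illustration of what in-stream learning can mean – again a hypothetical sketch, not Swim's implementation – an online detector can maintain a running mean and variance with Welford's algorithm and flag readings that deviate sharply from what the stream has seen so far, in constant memory with no stored history:

```python
import math

class OnlineAnomalyDetector:
    """Flag readings far from the stream's running distribution.
    Uses Welford's online mean/variance: constant memory, no history."""

    def __init__(self, threshold=3.0, warmup=10):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0          # running sum of squared deviations
        self.threshold = threshold
        self.warmup = warmup   # readings to observe before flagging

    def update(self, x):
        """Return True if x looks anomalous, then fold x into the model."""
        anomalous = False
        if self.n >= self.warmup:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) / std > self.threshold:
                anomalous = True
        # Welford update.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous
```

An operator never sees, or pays to store, the raw stream – only the rare flagged readings, which are the insights worth acting on.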
Learn more about Swim ESP™.