Building Digital Twins from Data

by Simon Crosby, on Apr 6, 2018 10:30:14 AM

SWIM EDX builds "digital twins" that learn on the fly, training themselves from real-world data so they can predict future behavior.

Our approach is the first that I’m aware of in which a digital twin is created dynamically, from the data alone. It also brings to market a first in real-time machine learning: each digital twin locally self-trains a DNN (deep neural network) model of the real-world object it represents, so it can analyze current activity and predict future performance.

SWIM is the brainchild of Rusty Cumpston and Chris Sachs – brilliant and extraordinarily accomplished entrepreneurs.  In this blog I intend to chronicle our journey of learning at the edge, to dive deeper into SWIM itself, to explore the limits of ML in our use cases, and to explain some of our product decisions.   I hope it’s a useful place to learn more.

In this post I want to unpick the concept of a digital twin, which is (as usual in tech) an overloaded term that sounds cool and begs for more detail.  The Wikipedia definition of a digital twin is pretty good:

“Digital twins integrate artificial intelligence, machine learning and software analytics with data to create living digital simulation models that update and change as their physical counterparts change. A digital twin continuously learns and updates itself from multiple sources to represent its near real-time status, working condition or position.”

SWIM EDX creates "digital twins" from streaming edge data.

The obvious question is, “Who creates the digital twin?”  Until now, twins have been developed when the physical object is designed: simulations that, given real-world data, attempt to digitally represent the state of their physical sibling based on its original design and expected performance.  That’s a useful construct for new systems, but what are we to do for the trillions of dollars of installed “things” – from machine sensors through traffic intersections to cookie factories – that won’t be replaced, and for which no simulated twin model exists and would be extremely difficult to recreate?  Let’s be specific: imagine I sat you down at a traffic intersection with a computer and asked you to write a program that behaves like that intersection.  That’s a very tough task, for many reasons, not least the complexity of the “black box” you’re observing – you can see how it behaves, but not how it works or why it makes the decisions it does.

SWIM EDX and Digital Twins

SWIM EDX dynamically builds a new digital twin for each physical entity, and each digital twin then becomes an actor in the SWIM distributed actor fabric.  Each twin receives and processes data from its real-world sibling and self-trains a model of how the system actually behaves, from the data alone.  This lets it predict how the system will behave in the future, given its current state.  It also gives us a measure of the accuracy of those predictions, which is useful to a consumer of the insights.  Going back to the challenge I set you at the intersection: since the twin for that intersection is a program (in the form of weights in a DNN), in essence we have learned a black-box behavioral model of the system through observation, hypotheses and feedback.  SWIM EDX has the advantage that it builds lots of small models, one per object it sees (enabling it to run locally on existing devices), and that those models self-train from the actual data, rapidly learning from the results of ‘what happens next’ to refine their predictions.
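To make the idea concrete, here is a minimal sketch of one small, self-training model per physical entity. This is not SWIM EDX code; all names are illustrative, and a simple linear autoregressive model stands in for the DNN a real twin would train. The pattern is the same: predict, observe what happens next, and use the error both to refine the model and to track the twin's own accuracy.

```python
# Minimal sketch only -- NOT SWIM EDX code. A linear autoregressive
# model stands in for the per-twin DNN described above.

class DigitalTwin:
    """One small model per physical entity, self-trained online from
    the entity's own observations, tracking its own prediction error."""

    def __init__(self, order=3, lr=0.1):
        self.order = order       # how many past samples the model sees
        self.lr = lr             # learning rate for online updates
        self.w = [0.0] * order   # model weights, learned from data alone
        self.history = []        # recent observations from the sibling
        self.abs_errors = []     # rolling record of prediction accuracy

    def predict(self):
        """Predict the next observation from recent history."""
        if len(self.history) < self.order:
            return self.history[-1] if self.history else 0.0
        recent = self.history[-self.order:]
        return sum(w * x for w, x in zip(self.w, recent))

    def observe(self, value):
        """Ingest one sample: score the previous prediction against
        'what happens next', then take one gradient step."""
        if len(self.history) >= self.order:
            pred = self.predict()
            err = pred - value
            self.abs_errors.append(abs(err))
            recent = self.history[-self.order:]
            # one step of stochastic gradient descent on squared error
            self.w = [w - self.lr * err * x
                      for w, x in zip(self.w, recent)]
        self.history.append(value)
        if len(self.history) > 10 * self.order:
            del self.history[0]  # a twin keeps only what it needs
```

Feeding a twin a steady stream of observations drives its prediction error down over time; the `abs_errors` record is the twin's own measure of how much its predictions can be trusted.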

Now, SWIM EDX isn't a universal solution.  It can be applied to 'gray' edge data that is numerical – even at very large volumes – but our approach needs time-based data to work.  Since we don't store the data, if there's a regulatory requirement to retain it we'd work alongside existing historian systems.  SWIM also isn't the right solution for very hard problems like image recognition or natural language processing, which require immense compute power.  But for the 95% of streaming edge data that is generated by smart devices, sensors, edge devices and existing systems, SWIM can deliver significant reductions in data volume, identify critical events, find hidden patterns and help predict future behavior for key data.

Learn More

Learn how SWIM EDX uses edge intelligence to deliver real-time insights from the dark data generated by connected enterprise systems.

Topics: Machine Learning, Stateful Applications, SWIM Software, Edge Analytics, Digital Twin, SWIM AI, Edge Computing, SWIM EDX
