Edge Computing Matters, Because Real-Time Matters

by Brad Johnson, on Mar 15, 2018 9:56:08 AM

Remember when downloading a movie took hours? At the time, that seemed reasonable: movies are big files, and dial-up or DSL speeds were only so fast. But today? A Netflix movie that needs to buffer at all is cause for anguish. The same shift is happening in manufacturing systems, enterprise applications, network infrastructure, and Smart Cities projects. In the past, it was sufficient to collect massive volumes of data for post hoc analysis and then generate time series reports, because that was the expectation. But as real-world infrastructure and other data sources come online, it’s clear that there is untapped value in analyzing and correlating real-time data streams.

A quick aside: the term “real-time” has come to mean many things in the software world. Further complicating matters, there are “near real-time” offerings that promise results within several minutes or more. At SWIM, we measure real-time in seconds or even milliseconds, depending on the application. In other words, “real-time” should mean what’s happening right now, so that perishable insights can be identified and acted upon before it’s too late.
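To make that distinction concrete, here is a minimal Python sketch of a freshness check. The latency budgets are illustrative assumptions for this example, not SWIM’s actual thresholds:

```python
import time

# Illustrative latency budgets, not SWIM's actual numbers.
REAL_TIME_BUDGET_S = 1.0          # seconds-to-milliseconds scale
NEAR_REAL_TIME_BUDGET_S = 5 * 60  # "near real-time": minutes or more

def freshness(event_ts: float) -> str:
    """Classify how perishable an insight derived from this event is."""
    age = time.time() - event_ts
    if age <= REAL_TIME_BUDGET_S:
        return "real-time: act on it now"
    if age <= NEAR_REAL_TIME_BUDGET_S:
        return "near real-time: insight is already degrading"
    return "stale: the moment has passed"
```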

Correlate Real-time Data Streams to Create New Value

According to Forrester Research, “enterprises have dozens, hundreds, and often thousands of applications that employees and customers use. All these applications generate data that has the potential to be valuable for real-time analysis, especially when combined from multiple sources...this is a key value proposition.” Edge computing systems, which process multiple edge data streams in parallel, enable the correlation of multiple data sources in real-time. No edge device is an island; every device performs a role in a larger system. By combining multiple data streams, enterprises and manufacturers can build higher-order context to better identify anomalies, perform root-cause analysis, predict maintenance needs, and spot inefficiencies.
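As a sketch of what that correlation looks like in practice (not SWIM’s API; the stream data, machine names, limits, and join window are all hypothetical), the following Python example joins two edge data streams within a short time window and flags a combined anomaly that neither stream reveals on its own:

```python
from collections import defaultdict

# Hypothetical readings from two edge data streams: (machine, time_s, value).
# In a real deployment these would arrive continuously and in parallel.
temperature_stream = [("press-1", 0.0, 92.0), ("press-1", 1.0, 97.5)]
vibration_stream   = [("press-1", 0.2, 0.4), ("press-1", 1.1, 0.9)]

TEMP_LIMIT_C = 95.0       # assumed per-machine temperature limit
VIBRATION_LIMIT_G = 0.8   # assumed vibration limit
JOIN_WINDOW_S = 0.5       # correlate readings taken within half a second

def correlate(temps, vibs):
    """Yield (machine, time) pairs where both streams breach their limits
    inside the join window -- higher-order context that neither stream
    exposes alone."""
    by_machine = defaultdict(list)
    for machine, t, g in vibs:
        by_machine[machine].append((t, g))
    for machine, t, temp in temps:
        if temp <= TEMP_LIMIT_C:
            continue
        for vt, g in by_machine[machine]:
            if abs(vt - t) <= JOIN_WINDOW_S and g > VIBRATION_LIMIT_G:
                yield machine, t

for machine, t in correlate(temperature_stream, vibration_stream):
    print(f"{machine}: correlated heat and vibration anomaly at t={t}s")
```

The same join-by-window pattern extends to any number of parallel streams; the point is that the anomaly only becomes visible once the sources are combined.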

Furthermore, recent innovations in artificial intelligence, such as machine learning and deep neural networks, have enabled edge applications that automatically reduce and filter opaque, “gray” device data. The ability to filter raw edge data with minimal human intervention helps enterprises and manufacturers reduce the amount of data that must be stored and transmitted to downstream systems. Applied in real-time contexts, machine learning can self-train on streaming data as it flows, continuously testing hypotheses and refining its models. These models may surface unexpected correlations, bottlenecks, or other opportunities for process optimization. There will always be a need for experienced operators to interpret data, but edge computing and machine learning give operators access to real-time insights, so they can make decisions faster, more efficiently, and with better information than ever before.
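To sketch the idea (again a hypothetical illustration, not SWIM’s implementation), here is a tiny self-training filter in Python. It learns what “normal” looks like from the stream itself and forwards only surprising readings, so steady-state data never has to leave the edge:

```python
class StreamingFilter:
    """A self-training edge filter: learns a running model of 'normal'
    from the stream and forwards only readings that deviate strongly."""

    def __init__(self, alpha=0.05, threshold=3.0, warmup=5):
        self.alpha = alpha          # learning rate for the running model
        self.threshold = threshold  # deviations-from-normal to forward
        self.warmup = warmup        # readings used to learn a baseline
        self.n = 0
        self.mean = 0.0             # running estimate of normal
        self.dev = 0.0              # running estimate of typical deviation

    def observe(self, x: float) -> bool:
        """Update the model on every reading; return True only if the
        reading should be forwarded downstream."""
        self.n += 1
        if self.n <= self.warmup:
            # Warm-up: learn a baseline and forward everything,
            # since the model knows nothing yet.
            self.mean += (x - self.mean) / self.n
            self.dev += (abs(x - self.mean) - self.dev) / self.n
            return True
        surprising = abs(x - self.mean) > self.threshold * max(self.dev, 1e-9)
        # Self-train on the stream as it flows.
        self.mean += self.alpha * (x - self.mean)
        self.dev += self.alpha * (abs(x - self.mean) - self.dev)
        return surprising

f = StreamingFilter()
stream = [10.0, 10.1, 9.9, 10.0, 10.1, 10.0, 9.9, 10.1, 25.0, 10.0]
kept = [x for x in stream if f.observe(x)]
print(kept)  # warm-up readings plus the 25.0 spike; steady data is dropped
```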

Learn More

Learn how SWIM uses edge intelligence to deliver real-time insights from the dark data generated by connected enterprise systems.

Topics: Machine Learning, Industrial IoT, SWIM Inc., Edge Analytics, Manufacturing, SWIM AI, Edge Computing, Fog Computing
