Key Takeaways From SDX Central's 2017 “Innovations in Edge Computing and MEC” Report (Part 1)
by SWIM Team, on Nov 2, 2017 11:58:12 AM
Call it the Great Quantification. IoT technologies have made it possible to gather valuable insights from industrial equipment, workers, buildings, and other sources. Decision-makers are now armed with vast amounts of information about how their businesses operate. These real-time edge data insights have the potential to reduce operating costs, improve efficiency, and deliver new capabilities. As businesses increasingly turn to IoT and analytics technologies to maximize the efficiency of industrial environments, it’s no surprise that Edge Computing has become one of the most hyped buzzwords of 2017. Moore’s Law continues to promise cheaper, faster, and smaller computers, and Edge Computing provides a model for leveraging these small-footprint, cost-effective devices in real-world contexts. With massive volumes of data being generated by connected sensors, the Edge Computing model has proven to be an important building block for the industrial analytics applications of the future.
In a previous post, we shared our Key Takeaways from Gartner’s "Hype Cycle for Emerging Technologies" 2017. In this three-part series, we take a look at SDX Central's 2017 “Innovations in Edge Computing and MEC” report. In Part 1, we’ll discuss the benefits of Edge Computing in industrial contexts. Part 2 addresses use cases for Edge Computing deployment, and in Part 3, we’ll review key considerations for future development of Edge Computing technologies.
However, before we dive into the benefits of Edge Computing, it’s important to acknowledge that Edge Computing exists within the greater data-centric ecosystem. SDX Central explains that “edge computing, fog computing, cloud computing and distributed clouds are related architectures that involve adapting cloud designs from hyperscale web companies to fit edge deployments.” Every use case is different, and so there is no singular reference architecture for industrial data applications. SDX notes that sometimes “edge platforms will run segregated from central clouds but in most cases, these edge platforms will run in conjunction with public or private clouds located in large data centers.” Identifying the right “deployment architectures will take time and will be dependent on the needs of each use case.”
SDX Central defines “seven unique benefits [of Edge Computing] that a centralized data center or centralized cloud cannot offer.” Below, we expand on SDX Central’s findings and identify how Edge Computing can be applied to achieve these benefits in real-world applications.
1. Low latency

For time-critical applications, even network latency can mean the difference between timely delivery and “too late.” According to SDX Central, “the edge can provide latency in milliseconds, while the multiple hops and long data transmission distances mean that the latency to the current ‘edge’ of the network i.e. CDNs is in the 50-150ms range. Latency to centralized data centers and the public cloud is even greater.” In cases where low latency is a priority, Edge Computing captures data at the source, providing time-critical insights for operators locally before forwarding reduced data to centralized systems for post hoc analysis.
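To make the latency argument concrete, here is a minimal sketch of a time-critical local decision. All names and the threshold are illustrative, not from a specific product: the point is that the safety action happens at the edge with no network round trip, and only a compact event record is queued for later upload.

```python
import time

PRESSURE_LIMIT = 8.0  # bar (illustrative threshold)

upstream_queue = []  # compact events forwarded to the cloud after the fact

def on_reading(pressure_bar, shutoff_valve):
    """Act locally first, report upstream second."""
    if pressure_bar > PRESSURE_LIMIT:
        shutoff_valve()  # no network round trip in the critical path
        upstream_queue.append({"event": "overpressure",
                               "value": pressure_bar,
                               "ts": time.time()})

# Simulated over-pressure reading trips the valve immediately
tripped = []
on_reading(9.2, shutoff_valve=lambda: tripped.append(True))
```

A cloud round trip of 50-150ms would sit squarely in the critical path here; handling the trip locally removes it entirely.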
2. High throughput
Often, the network is the bottleneck for high-volume data applications. Edge Computing provides a method for processing data at the source, thereby reducing the volume of data that needs to be transmitted over the network. Furthermore, because Edge Computing occurs in parallel on each device, edge-based systems can process far higher volumes of data simultaneously. According to SDX Central, “the throughput available to the user from the edge, served via cached or locally generated content, can be orders of magnitude greater than from a core data center or even the CDN.”
3. Data reduction
Similarly, Edge Computing can be used to reduce raw, unstructured data into a more information-dense, structured format for downstream applications. Industrial data frequently consists of duplicate or irrelevant records, with relevant insights obscured in a sea of “noisy” data. SDX explains that “by running applications such as data analytics at the edge, operators and application vendors can substantially cut down the amount of data that has to be sent upstream. This cuts cost and allows for other applications to transfer data.”
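A simple way to picture this reduction is windowed aggregation: instead of forwarding every raw sample upstream, the edge device sends one compact summary per window. The function and field names below are illustrative.

```python
from statistics import mean

def summarize_window(readings, device_id):
    """Collapse a window of raw sensor readings into one compact record.

    Upstream consumers get min/max/mean and a sample count instead of
    the full stream of raw values.
    """
    return {
        "device": device_id,
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": mean(readings),
    }

# 1,000 raw samples reduced to a single five-field summary
raw = [20.0 + (i % 10) * 0.1 for i in range(1000)]
summary = summarize_window(raw, "sensor-42")
```

Here a thousand readings become one small record, which is the kind of orders-of-magnitude reduction in upstream traffic the report describes.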
4. Context awareness
Edge Computing also provides local access to data context. This means that as new data is processed, devices can compare current state to past device states and make informed judgements about events and anomalies. This can be true of a single device, or of multiple devices on the same proximity network. SDX Central points out that “the edge has access to radio network, user and location information that is provided by the radio area network (RAN); this information can be used by edge applications.”
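One way to sketch this kind of context awareness is a device that keeps a short history of its own readings and flags values that deviate sharply from its recent baseline. The class name, window size, and tolerance below are illustrative assumptions, not a specific product API.

```python
from collections import deque

class EdgeAnomalyDetector:
    """Local context awareness: compare each new reading against the
    device's own recent history rather than a remote model."""

    def __init__(self, window=50, tolerance=3.0):
        self.history = deque(maxlen=window)
        self.tolerance = tolerance

    def observe(self, value):
        """Return True if `value` is anomalous relative to recent history."""
        if len(self.history) >= 10:  # need some context before judging
            baseline = sum(self.history) / len(self.history)
            spread = (max(self.history) - min(self.history)) or 1.0
            anomalous = abs(value - baseline) > self.tolerance * spread
        else:
            anomalous = False
        self.history.append(value)
        return anomalous
```

Because the history lives on the device, the anomaly judgement needs no round trip to a central system, and a fleet of devices runs these checks in parallel.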
5. Security

“CSPs can protect their networks against attacks from user equipment (UE) or customer premise equipment (CPE) using edge applications,” states SDX Central. By adopting an Edge Computing architecture, devices are decoupled from the network via a digital twin. Initial processing occurs locally, and if a security issue is discovered, the digital twin provides a firewall between the compromised device and the rest of the network. By creating this layer of separation between devices and applications, application integrity can be maintained even when a device is compromised.
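The decoupling described above can be sketched as a twin that applications query instead of the device itself; if the device is flagged as compromised, its updates are dropped while the last known-good state remains available. Class and method names here are hypothetical, not a specific digital-twin API.

```python
class DigitalTwin:
    """Minimal sketch of device/application decoupling via a twin."""

    def __init__(self, device_id):
        self.device_id = device_id
        self.state = {}
        self.quarantined = False

    def ingest(self, reading):
        """Accept a device update unless the twin is quarantined."""
        if self.quarantined:
            return False  # compromised device is firewalled off
        self.state.update(reading)
        return True

    def snapshot(self):
        """Applications read the twin's state, never the device directly."""
        return dict(self.state)
```

Quarantining the twin isolates a misbehaving device without taking the application offline, since downstream consumers keep reading the last trusted snapshot.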
6. Resilient connectivity

The ability to decouple local networks from the greater application is especially useful in remote and other environments where an internet connection is unavailable or inconsistent. As SDX Central explains, “a number of environments are not always connected to the Internet over high speed links. An edge cloud is able to provide services during periods of degraded or lost connections.” Edge Computing delivers the ability to perform analytics locally, even when the connection to centralized systems is degraded or lost.
7. Privacy and data locality

“Edge applications can help with privacy or data location laws,” says SDX Central. Edge Computing provides a method of retaining data locality while still making the full dataset available globally. This is especially helpful for maintaining compliance with international regulations such as the EU Data Protection Directive. Because the data for each device is stored on the edge, but remains accessible to other application services, there is no need to store an entire dataset in the cloud, where it is more vulnerable to breaches.
Though recent innovations in Edge Computing are providing new solutions to edge data problems that have plagued industrial IoT efforts in the past, Edge Computing is simply the application of cloud principles at the extreme edge of a network. As SDX Central states, “the notion of computing resources being close to users is not a new idea, but the application of cloud computing principles is. Edge computing brings virtualization, self-service APIs and on-demand provisioning/tear-down. Additionally, edge computing brings orchestration and automated lifecycle management of compute, storage and networking resources.” Simply put, Edge Computing allows application developers to deliver functionality that was once exclusive to cloud deployments into proximity networks. This allows for a better division of labor: tasks that make sense to handle locally are dealt with on the edge, while deeper analysis and other tasks that benefit from centralized compute are handled in the cloud.
Learn how SWIM can empower industrial applications by using Edge Computing to alleviate sensor data overload. Swim ESP can help transform and manage sensor data using Machine Learning on edge devices.