4 Real-world Use Cases to Differentiate Edge Computing & Fog Computing (Part 1)
by Brad Johnson, on Dec 1, 2017 10:29:35 AM
This is part 1 of a two part series. Read more about Fog & Edge Computing in Part 2.
Over the past few years, there have been scores of articles discussing the difference between Edge and Fog computing. Though Fog and Edge are distinct concepts, their close proximity to each other in real-world deployments has led to significant confusion, often with the two terms being used interchangeably. Generally speaking, the difference between Fog and Edge computing concerns where data processing capabilities are located in the physical network architecture of an application. The most concise definitions I’ve come across are from an article by Automation Weekly in 2016, which contained interviews with industrial software executives and marketers. According to the article:
- “Fog computing pushes intelligence down to the local area network level of network architecture, processing data in a fog node or IoT gateway.”
- “Edge computing pushes the intelligence, processing power and communication capabilities of an edge gateway or appliance directly into devices like programmable automation controllers (PACs).”
In order to help explain the difference between the two, we describe 4 real-world use cases that may include both Edge and Fog elements:
- Intelligent Transportation Management (ITS)
- Industrial & Commercial Networking
- Smart Metering for Utilities
- Autonomous Vehicles
In Part 1 of this series, we’ll examine Intelligent Transportation Management systems and Industrial & Commercial Networking and highlight where Edge and Fog computing fit in each application. In Part 2, we’ll take a look at Smart Metering and Autonomous Vehicle applications.
Fog & Edge Use Cases
1. Intelligent Transportation Management (ITS)
Smart transportation, especially traffic management, is an ideal application for Edge Computing technologies. By deploying compute intelligence locally, on the physical traffic hardware (controllers, signals, and environmental sensors), redundant or “noisy” data can be filtered out at the edge. This significantly reduces the amount of data that needs to be transmitted over the network, reducing operating and storage costs. Furthermore, because sensor data is handled at the edge, insights can be derived without incurring network latency, so traffic infrastructure can respond to changing traffic conditions as quickly as possible.
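As a rough illustration of edge-side filtering, the sketch below shows a simple deadband filter of the kind that could run on a traffic controller. The function name and threshold are hypothetical, not part of any specific product; the point is that only meaningful changes cross the network.

```python
# Hypothetical sketch: a deadband filter running on an edge device.
# A reading is forwarded only when it differs from the last reported
# value by more than a threshold, suppressing redundant "noisy" samples.

def deadband_filter(readings, threshold):
    """Yield only readings that change by more than `threshold`."""
    last_sent = None
    for value in readings:
        if last_sent is None or abs(value - last_sent) > threshold:
            last_sent = value
            yield value

# A vehicle-count sensor that mostly repeats itself:
raw = [10, 10, 11, 10, 25, 26, 25, 40]
sent = list(deadband_filter(raw, threshold=2))
# Eight raw samples collapse to three transmitted values.
```

Even this trivial filter shows why edge processing cuts network and storage costs: most sensor traffic is repetition.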
So where does the Fog come in? For some Smart Cities, traffic infrastructure may be connected via cellular networks or a Wi-Fi mesh. In these cases, micro data centers can be deployed locally, alongside cellular towers or Wi-Fi routers, in order to perform analysis on the aggregate traffic data from multiple edge nodes (controllers, signals, and environmental sensors). Functioning as a “local cloud,” these data centers are deployed within the local area network, and so they are considered to be a part of “the Fog.”
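The aggregation role of such a fog node can be sketched in a few lines. This is a minimal illustration with made-up intersection names, assuming edge nodes report simple (id, vehicle-count) pairs; a real deployment would of course handle richer telemetry.

```python
# Hypothetical sketch: a fog node (micro data center on the LAN)
# summarizing per-intersection reports from multiple edge nodes,
# so one summary travels to the cloud instead of every raw report.
from collections import defaultdict

def aggregate(reports):
    """reports: iterable of (intersection_id, vehicle_count) tuples.
    Returns a per-intersection summary with totals and averages."""
    totals = defaultdict(lambda: {"count": 0, "reports": 0})
    for intersection, vehicles in reports:
        totals[intersection]["count"] += vehicles
        totals[intersection]["reports"] += 1
    return {k: {**v, "avg": v["count"] / v["reports"]}
            for k, v in totals.items()}

edge_reports = [("5th_and_main", 12), ("5th_and_main", 18), ("oak_ave", 7)]
summary = aggregate(edge_reports)
```

The design point is locality: because the fog node sits on the same LAN as the edge nodes, this aggregation happens without a round trip to the cloud.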
2. Industrial & Commercial Networking
In today’s technological landscape, hardware companies own the edge. Deployed network hardware touches data before it ever reaches the cloud. Similar to the traffic management use case above, industrial and commercial network operators can take advantage of their proximity to the data source and leverage the Fog, deploying local data centers alongside routers and other network equipment in order to analyze individual router and overall network performance. This enables network operators to throttle down the volume of data on its way to the cloud, lessening the burden on the network and cloud applications. Additionally, both Fog and Edge deployments on routers can apply Machine Learning to optimize channel selection and adapt to changes in the network landscape.
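To make the channel-selection idea concrete, here is a minimal epsilon-greedy sketch: the agent tracks observed interference per channel and mostly picks the quietest one, occasionally exploring. This is a generic illustration of learning-based channel selection, not the specific ML used by any vendor; channel numbers and the interference scale are invented.

```python
# Hypothetical sketch: epsilon-greedy Wi-Fi channel selection on a
# router or fog node. Lower observed interference is better.
import random

class ChannelSelector:
    def __init__(self, channels, epsilon=0.1):
        self.scores = {ch: 0.0 for ch in channels}  # running avg interference
        self.counts = {ch: 0 for ch in channels}
        self.epsilon = epsilon

    def pick(self):
        # Mostly exploit the quietest channel, occasionally explore.
        if random.random() < self.epsilon:
            return random.choice(list(self.scores))
        return min(self.scores, key=self.scores.get)

    def observe(self, channel, interference):
        # Incrementally update the running average for this channel.
        self.counts[channel] += 1
        n = self.counts[channel]
        self.scores[channel] += (interference - self.scores[channel]) / n

sel = ChannelSelector(channels=[1, 6, 11])
for ch, noise in [(1, 0.9), (6, 0.2), (11, 0.5)]:
    sel.observe(ch, noise)
# After these observations, channel 6 has the lowest interference score.
```

Because the model updates incrementally from local observations, it can adapt to changes in the network landscape without shipping raw measurements to the cloud.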
Advances in Edge computing technologies have provided an alternative to locating a data center with each router or network nexus. For hardware manufacturers, the routers themselves are edge devices. Edge agents (software instances) can be deployed on routers in order to analyze router performance locally. This eliminates the need to co-locate server architectures with routers, as the routers themselves are empowered to perform complex analytics. This is ideal for remote sites and other locations where co-locating a data center may not be possible. Edge agents can be deployed and administered remotely, and do not require additional hardware to support them.
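The edge-agent pattern can be sketched as follows: the agent keeps a rolling latency baseline on the router itself and reports upstream only when performance deviates sharply. The class name, window size, and threshold factor are all illustrative assumptions, not a real product API.

```python
# Hypothetical sketch: an edge agent running on the router itself.
# It maintains a rolling latency baseline and emits an alert only on
# a sharp deviation, so no co-located server is needed and only
# exceptional events leave the device.
from collections import deque

class EdgeAgent:
    def __init__(self, window=5, factor=2.0):
        self.samples = deque(maxlen=window)
        self.factor = factor

    def ingest(self, latency_ms):
        """Return an alert dict if latency is anomalous, else None."""
        alert = None
        if len(self.samples) == self.samples.maxlen:
            baseline = sum(self.samples) / len(self.samples)
            if latency_ms > baseline * self.factor:
                alert = {"alert": "latency_spike",
                         "value": latency_ms,
                         "baseline": baseline}
        self.samples.append(latency_ms)
        return alert

agent = EdgeAgent()
alerts = [a for a in map(agent.ingest, [10, 11, 9, 10, 10, 45]) if a]
# Six samples are processed on-device; only the 45 ms spike is reported.
```

Everything above runs in the agent's own process, which is why no additional co-located hardware is required.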
Read more about Fog and Edge Computing use cases in Part 2 of this series.
Learn how SWIM uses Edge Computing to alleviate sensor data overload. Swim ESP can help transform and manage sensor data, using Machine Learning to intelligently filter and act on data created by edge devices in real time.