Written by David Linthicum exclusively for Nelson Hilliard
On an Internet of Things (IoT) system running on a public cloud, data frequently needs to be sent from a set of sensors to a database that resides inside the public cloud. If this sounds problematic, it is. The time it takes for the data to travel from the sensor or device to the cloud (that is, the latency) is often too long to meet the requirements of an IoT system that depends on an immediate response. As a result, some device manufacturers avoid the public cloud, and their IoT systems can't take advantage of the cost and resource efficiencies of cloud-based computing.
Edge computing emerged as an alternative to transmitting every piece of data back to a centralized cloud for processing. Edge computing physically pushes some of the data storage and processing out of the cloud to the edges, typically residing within the device or sensor that collects the data. However, the data and processing are still coupled with processes and data storage systems on the public cloud, and the two act as a single virtual unit. This has the advantage of reducing latency and thus improving response time, which is extremely helpful for IoT systems. However, it comes at the cost of added complexity.
The idea of edge computing isn’t new, of course—we’ve been doing it for years to solve issues with network latency or machine latency. Specifically, there are a few existing concepts to consider, including the cloudlet, an architectural element from CMU that arises from the convergence of mobile computing and cloud computing. Another is Cisco’s fog computing, which extends the cloud to be closer to the things that produce and act on IoT data.
The problem is that IoT applications need to react almost instantly to the data generated by a sensor or device, such as shutting down a smelting machine that’s about to overheat and explode. There are hundreds of use cases where reaction time is a key component in an IoT system, which is why latency is such an important concept. Reliability and data processing are critical as well, such as processing data without depending upon communications with a remote cloud-based application.
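The overheating-smelter case above can be sketched in a few lines. This is a minimal, hypothetical illustration of why the decision must live at the edge: the check runs locally against a threshold, so the shutdown does not wait on a round trip to the cloud. All names here (`check_temperature`, the threshold value) are illustrative, not a real device API.

```python
# Minimal sketch of edge-side processing: react to a sensor reading
# locally instead of waiting on a cloud round trip.
# The threshold and function names are hypothetical examples.

OVERHEAT_THRESHOLD_C = 450.0  # assumed safety limit for the smelter


def check_temperature(reading_c: float) -> str:
    """Decide locally whether to cut power, with no network dependency."""
    if reading_c >= OVERHEAT_THRESHOLD_C:
        return "SHUTDOWN"  # act immediately at the edge
    return "OK"            # normal readings can be reported to the cloud later


# A reading just over the limit triggers an immediate local action.
print(check_temperature(451.2))
```

The point of the sketch is the control flow, not the numbers: the latency-critical branch completes entirely on the device, while routine telemetry can still flow to the cloud asynchronously.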
Thus, edge computing as it relates to cloud computing is becoming a more important concept and best practice. It frees the cloud application architect from having to send all data back to the public cloud. However, it does not treat the system that resides in the public cloud and the system on the edge as separate: the core idea of edge computing is that those two components, while physically distributed, are virtually coupled.
As the edge components move around, they should be automatically located and their data synced with a centralized data store.The data stored in the cloud becomes the single source of truth.However, the data may be stored temporarily at the edge, and processing can occur there as well.
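The sync pattern described above can be illustrated with a small sketch, assuming a simple buffer on the device and a list standing in for the cloud database. The class and method names (`EdgeBuffer`, `record`, `sync`) are hypothetical, not from any real edge platform; the point is that data is held temporarily at the edge and then flushed to the cloud store, which remains the single source of truth.

```python
from collections import deque


class EdgeBuffer:
    """Hypothetical sketch: hold readings at the edge, then sync them
    to the cloud store, which remains the single source of truth."""

    def __init__(self):
        self.pending = deque()  # temporary edge-side storage

    def record(self, reading: dict) -> None:
        """Store a reading locally; no network call needed here."""
        self.pending.append(reading)

    def sync(self, cloud_store: list) -> None:
        """Flush all buffered readings to the (simulated) cloud store."""
        while self.pending:
            cloud_store.append(self.pending.popleft())


cloud = []  # stands in for the centralized cloud database
edge = EdgeBuffer()
edge.record({"sensor": "s1", "temp_c": 212.0})
edge.record({"sensor": "s1", "temp_c": 214.5})
edge.sync(cloud)  # after sync, the cloud holds every reading
print(len(cloud), len(edge.pending))
```

A real system would add retries, conflict handling, and discovery of edge nodes as they move, but the shape is the same: buffer locally, sync centrally.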
Applying this concept will free many IoT systems from being limited to non-cloud platforms. The need for computing at the edge of the cloud becomes more critical as more IoT systems and services find their way to public clouds. Edge computing fills a real need as enterprises stand up IoT systems and look for the efficiency of cloud computing.
David S. Linthicum is a managing director and chief cloud strategy officer at Deloitte Consulting, and an internationally recognized industry expert and thought leader. Connect with David on LinkedIn and Twitter.
At Nelson Hilliard we specialise in cloud technologies, sourcing the top 20% of cloud professionals inspired to work for you through our specialised marketing and profiling. If you are interested in having a quick talk with me regarding your employment needs, please feel free to reach out.