What Is Edge Computing?

The goal of edge computing is to bring computation and data storage closer together to improve response times and save bandwidth. In this blog post, I want to figure out why this emerging technology has gained momentum over the last couple of years. We'll discuss what edge computing is and how it facilitates the deployment of a broader range of applications. Furthermore, I'd like to highlight how edge computing differs from cloud computing and point out use cases for edge computing devices and applications.

Edge computing is viewed as an evolution of content delivery networks (CDNs). Established in the 1990s, these geographically distributed networks of proxy servers and data centers served to bring content closer to the end user. At the beginning of the 2000s, these networks evolved to host applications and application components on edge servers, resulting in the first commercial edge computing services, which hosted applications such as dealer locators, shopping carts, real-time data aggregators, and ad-insertion engines. Modern edge computing significantly extends this approach through virtualization technology, which makes it easier to deploy and run a broader range of applications on edge servers.

Why Does Edge Computing Matter?

To answer this question, let's start from a broad definition of edge computing and go from there: edge computing can be defined as all computing that happens outside the cloud, at the edge of the network. This computing paradigm counteracts the problems that arise from centralized big-data processing.

Rather than transmitting data to a data center, edge computing enables devices to process data themselves or hand it off to a local computer. It can therefore address limitations in computing capabilities, bandwidth, and response times. This alone can save companies a lot of money, since the bandwidth costs of cloud computing are sometimes much higher than initially expected.
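To make the bandwidth-savings claim concrete, here is a back-of-the-envelope calculation. All of the figures (fleet size, footage volume, egress price, upload fraction) are made-up assumptions for the sake of the arithmetic, not real pricing:

```python
# Back-of-the-envelope illustration of the bandwidth savings described above.
# Every number here is an illustrative assumption.

cameras = 100                      # assumed fleet of video sensors
raw_gb_per_day = 50                # assumed raw footage per camera per day
egress_cost_per_gb = 0.09          # assumed per-GB transfer price in USD

# Cloud-only: ship every byte to the data center.
cloud_only_cost = cameras * raw_gb_per_day * egress_cost_per_gb

# Edge: analyze footage locally and upload only flagged events (~1% of data).
edge_upload_fraction = 0.01
edge_cost = cloud_only_cost * edge_upload_fraction

print(f"cloud-only: ${cloud_only_cost:,.2f}/day, edge: ${edge_cost:,.2f}/day")
```

With these (hypothetical) numbers, filtering at the edge cuts the daily transfer bill from $450 to $4.50 — the kind of gap that makes the economics hard to ignore.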

The biggest advantage of edge computing is that it allows smart applications and devices to respond to data almost immediately. This makes the real-time applications on your smartphone more efficient: previously, a cell phone taking a picture of your face for facial recognition would need to run the calculations through a cloud-based service, using a lot of battery power and time in the process. With an edge computing model, however, these computations run locally on an edge server or gateway, or even on the cell phone itself, provided your smartphone has the necessary processing power. Applications such as virtual and augmented reality, self-driving vehicles, smart cities, and industrial automation increasingly rely on edge computing for its fast processing and response times.
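The "run it on the closest capable tier" idea above can be sketched in a few lines. This is a minimal illustration, not a real edge framework — the tier names and round-trip figures are assumptions chosen just to show the dispatch logic:

```python
# A minimal sketch of the dispatch logic described above: an application
# prefers the compute tier closest to the data source. All names and
# latency figures are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ComputeTier:
    name: str
    round_trip_ms: int   # assumed typical network round-trip to this tier
    available: bool      # e.g. the device may lack the processing power

def pick_tier(tiers):
    """Choose the closest available tier (lowest round-trip time)."""
    candidates = [t for t in tiers if t.available]
    return min(candidates, key=lambda t: t.round_trip_ms)

tiers = [
    ComputeTier("on-device", round_trip_ms=0, available=True),
    ComputeTier("edge-gateway", round_trip_ms=10, available=True),
    ComputeTier("cloud", round_trip_ms=120, available=True),
]

print(pick_tier(tiers).name)  # the phone itself wins when it has the power
```

If the device can't handle the workload (flip `available` to `False` on the first tier), the same logic falls back to the edge gateway before ever touching the cloud — which is exactly the latency story behind the facial-recognition example.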

Are We Living in the Era of the Internet of Everything?

With the rollout of 5G networks and the proliferation of IoT, many believe we have moved into a new era called the Internet of Everything (IoE). While the initial goal of edge computing was to reduce bandwidth costs for Internet of Things (IoT) devices, experts say future growth will come from the need for more real-time applications requiring local processing and storage capabilities. In the coming years, the vast volumes of data generated by the things surrounding us will become an ever larger part of our daily lives, leading to a growing number of applications deployed at - you guessed it - the edge to consume this data.

As always, stay curious!

©2020 by The Unlikely Techie.