In recent years, the debate of edge computing vs. cloud computing has been a major topic in the tech world. Many believe that edge computing is the next big thing and will soon replace cloud computing as the dominant computing philosophy.
However, it’s important to first understand that the two are very different from each other. Contrary to popular misconception, edge computing won’t replace cloud computing; rather, the two will coexist to achieve a better technology ecosystem.
Before we can really understand the difference between the two concepts, and how we can implement them, we’ll need to dig deeper into the two computing philosophies:
What Is Edge Computing
Edge computing is a networking philosophy in which computing resources, such as an edge server (more on this later), are placed at the “edges” of the network.
Still too complicated? How about this: edge computing focuses on bringing the computing process as close to the device (the data source) as possible, “edging” toward the device. The idea is that when computing happens closer to the data source, there is less latency and less bandwidth usage.
So, the basic principle of edge computing is to avoid running too many processes in the cloud and instead move as many of these processes as possible to local devices: your computer, your smartphone, an edge server, an IoT device, among others.
An example of edge computing is automated vehicles. Autonomous driving technology processes and computes most of the data within the vehicle, and only utilizes remote network servers when necessary. For instance, when downloading GPS data, the vehicle accesses the cloud server, but data from sensors is processed within the local system. This is edge computing.
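The local-vs-remote split described above can be sketched in a few lines of Python. This is a minimal illustration of the idea, not a real autonomous-driving stack: the task names and helper functions are all hypothetical.

```python
# Hypothetical sketch: route latency-critical tasks to the edge (the vehicle),
# delay-tolerant tasks to the cloud. All names here are illustrative only.

LOCAL_TASKS = {"sensor_fusion", "obstacle_detection"}  # must stay on the vehicle
CLOUD_TASKS = {"map_download", "software_update"}      # can tolerate a round trip

def route_task(task: str) -> str:
    """Decide where a task runs: on the vehicle (edge) or on a remote server."""
    if task in LOCAL_TASKS:
        return process_locally(task)
    elif task in CLOUD_TASKS:
        return fetch_from_cloud(task)
    raise ValueError(f"unknown task: {task}")

def process_locally(task: str) -> str:
    # Placeholder for on-vehicle computation (e.g. running a model on sensor data).
    return f"{task}: handled on the edge"

def fetch_from_cloud(task: str) -> str:
    # Placeholder for a network round trip to a remote server.
    return f"{task}: handled in the cloud"
```

The point of the pattern is simply that the default path is local; the network is only touched when the task genuinely needs it.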
What Is Cloud Computing
We are probably more familiar with the term “cloud computing” because it’s relatively older and has been at peak popularity for the past several years.
In a nutshell, cloud computing simply means storing and accessing data and even applications/programs over the internet, by utilizing a network of remote servers rather than using a local server.
For example, when we are using Instagram, all the necessary computing processes are done on Instagram’s server, and we only download the final product. This is cloud computing.
Edge Computing vs. Cloud Computing: The Core Differences
Now that we’ve discussed the main concepts behind the two computing infrastructures, we can discuss their differences.
While, as we have mentioned, they are very different from each other, two main differences stand out:
Where The Computing Happens
In cloud computing, most of the computation and data processing happens on remote servers accessed through cloud services. For example, when we access iCloud, all data processing happens on Apple’s remote servers.
In edge computing, on the other hand, data processing happens on the device itself, or within an edge server that is placed between the device (your computer, a smart thermostat, your smartphone, a smart doorbell, etc.) and the cloud servers.
The Advantage and Purpose
The main reasons for using cloud computing are more storage and more processing power. By using the speed of the internet, we can “borrow” the huge storage space of Google, for example, or the processing power of Amazon EC2.
On the other hand, the main benefit of edge computing is proximity and reduced latency. For example, in an automated vehicle, we wouldn’t need to worry about losing network signal, since everything is processed within the vehicle. Also, when we need to scale the system, we only need to upgrade our devices and wouldn’t need to rely on upgrading our cloud resources (which can be very expensive).
Edge Computing FAQ
1. What Actually Is the “Edge”?
The term “edge” in edge computing, or more accurately “network edge”, refers to a physical location where a local network (or the device itself) connects to the internet.
For example, let’s say we have a smart doorbell (an IoT device), and within it we have a processor or a small computer. This processor can be considered the “edge”. Similarly, in an automated vehicle, the computer within the car can be considered an “edge”.
The edge of the network must be physically and geographically close to the device, as opposed to cloud servers, which can be very far from the device.
2. What’s an Edge Server?
Simply put, an edge server is a computer or server that exists at the “edge” of a local network. The edge server’s purpose is to connect two separate networks and act as data storage located as geographically close as possible to the client device, so it can reduce latency and improve loading times significantly.
Using the smart doorbell example above, the Wi-Fi router within the house can be considered an “edge server”, and servers at the homeowner’s ISP can also act as edge servers.
In any particular network, several different devices connect to each other using one or more predefined configurations. The whole network might need to connect to another network or to the internet in general. In this case, we need a “bridge” to carry traffic from the original network to another. Any device that forms this bridge at the network’s edge is an “edge server”.
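The bridge-plus-storage role of an edge server can be sketched as a toy Python class. This is purely illustrative: `origin_fetch` and the cached keys are hypothetical stand-ins, not a real API.

```python
# Toy sketch of an edge server: it bridges the local network to a distant
# origin server, and keeps copies of data close to the client.

class EdgeServer:
    def __init__(self, origin_fetch):
        self.origin_fetch = origin_fetch  # callable that crosses the "bridge"
        self.cache = {}                   # data stored at the network's edge

    def get(self, key):
        """Serve from the local cache when possible; otherwise forward to origin."""
        if key in self.cache:
            return self.cache[key], "edge-hit"   # no trip to the remote network
        value = self.origin_fetch(key)           # cross the bridge to the origin
        self.cache[key] = value                  # keep a copy at the edge
        return value, "origin-fetch"

# Usage: the second request for the same key never leaves the edge.
edge = EdgeServer(origin_fetch=lambda k: f"data-for-{k}")
print(edge.get("video.mp4"))  # ('data-for-video.mp4', 'origin-fetch')
print(edge.get("video.mp4"))  # ('data-for-video.mp4', 'edge-hit')
```

The design choice mirrors the doorbell example: the expensive step (reaching the remote network) happens once, and every later request is served from hardware that sits geographically next to the client.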
3. What Are the Advantages of Edge Computing?
As we have mentioned above, the key advantage of edge computing is reduced latency: by keeping the device physically close to the “edge”, we minimize bandwidth utilization and server resources.
Every time a device needs to communicate with a server, there is a delay. The further away the server is, the larger the delay will be. For example, let’s say two people are using WhatsApp to chat with each other within the same building. Every time a message is sent, the signal must travel to WhatsApp’s cloud server somewhere around the globe, and then travel the same distance back before the recipient can view the message. If, instead, an edge server managed the process within the building, we could eliminate a significant delay.
Delay (or latency) can be caused by many different factors, from the available bandwidth to how far away the server is located. However, the more processes we can bring to the network’s edge, the more delay we can avoid.
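A back-of-the-envelope calculation shows how much of this delay comes from distance alone. The figures below are rough assumptions (signals in optical fiber travel at roughly two-thirds the speed of light), not measurements, and they ignore processing and queuing time entirely:

```python
# Rough illustration of propagation delay: round-trip time grows linearly
# with distance to the server. Assumed signal speed, not a measured value.

SIGNAL_SPEED_KM_S = 200_000  # ~2/3 the speed of light, typical for fiber

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time in milliseconds."""
    return 2 * distance_km / SIGNAL_SPEED_KM_S * 1000

# A cloud server ~8,000 km away vs. an edge server in the same building (~0.1 km):
print(round(round_trip_ms(8000), 1))  # 80.0 (milliseconds, before any processing)
print(round(round_trip_ms(0.1), 4))   # 0.001
```

Even under these generous assumptions, the distant server adds tens of milliseconds per message purely from travel time, which is exactly the overhead the in-building edge server in the WhatsApp example would eliminate.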
The limited amount of cloud resources and bandwidth is also a reason to maximize edge computing. It is predicted that 18 billion IoT devices will be connected by 2022, and accommodating all these devices with cloud resources will be very difficult and very expensive. This is where edge computation will become a necessity.
To summarize, here are the key benefits of edge computing:
- Significantly reduced latency
- Lower bandwidth and server resource usage (and the associated costs)
- Added functionality for the connected device (e.g. enabling an automated vehicle to process data within the car)
4. Are There Any Concerns Surrounding Edge Computing?
One of the main concerns with edge computing is exposure to hacking and data breaches. More devices in the system (with the addition of edge computers or edge servers) translate to an increase in attack vectors: more opportunities for hackers to attack the different devices.
Another concern, or rather disadvantage, of edge computing is the increased requirement for local hardware. For example, in an automated vehicle, we’ll need to add a tablet or even a computer within the car, which can be quite expensive. Using edge servers can help mitigate the need for additional local hardware, maintaining the proximity of the local connection while allowing the system to utilize the processing power of remote devices.
While it’s true that the major surge in IoT and the increasing strain on cloud resources will require us to gradually move toward edge computing, cloud computing is here to stay.
Both edge computing and cloud computing offer different benefits and serve different purposes, and they should work hand in hand in providing faster, more reliable data processing and a better user experience for everyone.