
What is Edge Computing? Definition and Cases Explained

With the proliferation of 5G communications technology, edge computing—the practice of performing computing tasks as physically or logically close as possible to where data is created and commands are executed—has begun to permeate the modern world of smart technology. In this article, we explore the concept of edge computing in detail, and explain how it offers many excellent advantages, especially in terms of latency reduction for applications that rely on real-time decision-making.
What is Edge Computing?
Essentially, edge computing refers to the concept of computing as close as possible to where data is created and commands are executed. This form of computing could potentially be on the client device itself – such as a smartphone, surveillance camera, drone, or autonomous vehicle – or it may be performed a few hops away, such as on a locally connected server next to a cell tower or in a small, local data center. The important takeaway is that edge computing is done as geographically or logically close to the source of the data as possible, in order to reduce network traffic and latency. 《Glossary: What is Data Center?

This is necessary even where 5G communications and cloud computing are already ubiquitous, because the time it takes to send a request to a central server and get a response back may still be too long, especially for time-critical tasks such as steering an autonomous vehicle. A suitable edge device, whether it is an edge server or a powerful client device, is able to circumvent the problems of latency, bandwidth, and response time. Edge computing may also be preferable in circumstances where privacy or security is a concern. The next wave of smart technology, exemplified by artificial intelligence and the Internet of Things (IoT), relies heavily on the deployment of edge devices.

Glossary:
What is 5G?
What is AI?
What is IoT?

This trend is expected to continue for quite some time. According to the market research firm IDC, by 2023, more than half of newly established enterprise IT infrastructures will be deployed at the edge rather than in central data centers, while Gartner forecasts that up to 75% of enterprise-generated data will be created and processed at the edge of the network by 2025.

Edge computing is the future; however, before we ruminate on what it portends, it helps to take a step back and look at how we got here, so we can get a grasp on the tools we will need to carry this momentum forward.
It Was Acceptable in the 80s – Edge Computing Has Come Full Circle
The basic concept of edge computing is not new. When networking and computer technology were first introduced in the 1960s, they relied heavily on mainframes – huge, monolithic computing systems placed in an office or laboratory – that were hooked up to multiple “dumb” terminals through local or wide area networks (LAN or WAN) so users could gain access. In the age of mainframes, computing was basically performed in a central location—an early incarnation of the modern data center.《Glossary: What is LAN?

However, the 1980s introduced PCs (Personal Computers), which miniaturized the computer and brought it into our homes and offices. Computing power was now placed at the “edge” – the physical location where data was being collected and processed. This was a shift from a centralized to a distributed form of computing.

By the early 2000s, another shift had taken place. With the rise of the Internet and related networking technologies, which enabled higher connection speeds and greater bandwidth, it became possible to transmit more data to a remote location for computing or storage much more quickly. This return to centralized computing allowed for more efficiency, and it was accessible from any place that could connect to the public Internet. Amazon introduced its EC2 (“Elastic Compute Cloud”) service in 2006. Since then, more and more computing tasks have been performed remotely, creating demand for technologies such as SaaS (Software as a Service) and PaaS (Platform as a Service). A simple example everyone is familiar with is Google’s Gmail and Drive applications, which host and store email and files in the cloud, and can be accessed from virtually anywhere with a web browser. Many businesses, ranging from small startups to huge corporations, now run some or all of their computing workloads—such as ERP and accounting systems, web services and applications—on public cloud services like Amazon AWS, Microsoft Azure, or Google Cloud. This has become known as the era of Cloud Computing, which heralded a shift back to centralized computing.

Learn More:
《Glossary: What is Cloud Computing?
What is a Server? A Tech Guide by GIGABYTE
Using GIGABYTE, NIPA Cloud Soars Among CSP Giants in Thailand
Why is Edge Computing Important?
You may wonder: if there is a move back to centralized computing, why is edge computing the next big thing?

The fact of the matter is, even as computing resources are being consolidated once again, people are discovering the limitations of this system. For one, bandwidth availability and costs may form an insurmountable hurdle as more and more devices are linked to the IoT. For another, the farther the data center is from the data, the greater the latency will be, making real-time responses unrealistic. Obviously, new inventions (such as self-driving cars) cannot be implemented before this issue is resolved.

Edge computing offers a solution to these problems. An edge device acts as a local source of processing and storage for connected client devices; in effect, it is like a miniature local data center. You can think of it as a local magistrate who can take care of a majority of the people’s supplications, deferring to the central authority only in special cases. A network of edge devices greatly reduces the bandwidth costs and increases availability in the era of cloud computing. Operators can also save money by reducing the amount of data that actually needs to be sent back to the central data center for processing.

If you are wondering how this is different from the previous era of distributed computing, the answer is this: the mobile devices we are all so fond of have their limitations. Dedicated edge devices offer much better processing power than individual client devices. By completing the bulk of the processing work on edge devices that are much closer to the data, latency issues are virtually eliminated. Indeed, edge computing can make state-of-the-art processing and smart real-time decision-making available in just about every field. The proliferation of edge computing will make groundbreaking smart solutions, such as autonomous vehicles, a reality in the not-too-distant future.

Learn More:
《Glossary: What is ADAS?
Constructing the Brain of a Self-Driving Car with GIGABYTE
When Every Millisecond Counts, Latency is Non-negotiable
Let’s look at an interesting example that’s already part of our daily lives: computer vision, a kind of AI used in video surveillance and facial recognition systems (such as face ID building access systems, vehicle license plate recognition systems in parking lots, or even city-wide CCTV systems used by law enforcement). In order to recognize a human face, including attributes such as gender or age, and match it to a database of existing records (such as employee photos or a police database), the system relies on a machine learning model. Although the model is generated and tested before the system is deployed, the computer vision system also needs to run it every time a human face is recorded, to perform real-time matching and recognition. This is called inferencing, and it requires a certain amount of processing power. It is easy to run this kind of workload in the cloud, but the delay of sending data back and forth across the network will be noticeable, especially if you need to stand for an extra second in front of the facial recognition terminal. And that’s just for one camera – if there are tens or even hundreds of cameras in a large building or an entire city, the data of thousands of people will need to be sent back and forth across the network, leading to either a longer delay due to high latency, or extremely high network connection costs as more and more bandwidth is required to send terabytes of data across the network.

Glossary:
What is Computer Vision?
What is Machine Learning?

That’s where edge computing is more beneficial – these workloads can be processed locally to minimize both latency and network bandwidth usage, either on a small device connected directly to the video camera, or on an edge server located on a local area network. After the data is processed, results can be delivered back to the device or application for action, and only the relevant data is sent back to the cloud, reducing bandwidth needs.
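
To make this concrete, here is a minimal sketch of the edge-side pattern described above, written in Python. The model runner, the confidence threshold, and the cloud endpoint are all hypothetical placeholders rather than any real GIGABYTE or camera-vendor API; the point is simply that inferencing happens locally and only a few bytes of match metadata ever cross the network.

```python
# Hypothetical sketch: edge-side inferencing for a facial recognition camera.
# run_face_model() and CLOUD_ENDPOINT are placeholders, not a real API.
import json
import urllib.request

CLOUD_ENDPOINT = "https://example.com/events"    # hypothetical central data center endpoint
MATCH_THRESHOLD = 0.9

def run_face_model(frame: bytes) -> dict:
    """Run the pre-trained recognition model locally on the edge device.

    Placeholder: a real deployment would call a local (typically
    GPU-accelerated) inference runtime instead of returning a canned result.
    """
    return {"person_id": "employee-0042", "confidence": 0.97}

def process_frame(camera_id: str, frame: bytes) -> dict | None:
    """Infer at the edge; forward only relevant match events to the cloud."""
    result = run_face_model(frame)               # inferencing happens locally
    if result["confidence"] < MATCH_THRESHOLD:
        return None                              # nothing to report, no bandwidth used
    event = {"camera_id": camera_id, "person_id": result["person_id"]}
    req = urllib.request.Request(
        CLOUD_ENDPOINT,                          # only a small JSON event crosses the WAN,
        data=json.dumps(event).encode(),         # never the raw video frame
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=2)
    return event
```

The same division of labor applies whether the model runs on a small device attached to the camera or on a shared edge server on the local network; in either case, the raw footage stays local.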
Education, Entertainment, and More: Edge Computing Put in Practice
Besides security, facial recognition applications that have been made possible with edge computing can be used in a variety of other sectors—the sky is the limit. Such was the kind of outside-the-box thinking that went into the creation of the AI Facial Recognition Solution for Taiwan’s school libraries: a nifty combination of edge computing and AI tech that has already been implemented in New Taipei grade schools like Beixin Elementary and Xindian Elementary.

Inside the libraries of these schools, a facial recognition device was set up to scan students’ faces as they checked out books. A computer or server (in the case of Beixin Elementary, an R-Series Rack Server from GIGABYTE) is installed in the school to serve as the edge device. Edge computing fulfills three important roles in these scenarios: one, it protects the database where information is stored; two, it is home to the AI that continuously improves the solution’s speed and accuracy; and three, it hosts the web server that runs the actual facial recognition software. This application of edge computing makes sure using your face to check out library books is safe and easy, and it is an ingeniously organic way to infuse a traditional learning environment with new technology.

Learn More:
《More information about GIGABYTE's Rack Server
Facial Recognition Brings AI Education and Edge Computing to Taiwan's Schools

Another vertical market where edge computing is making a difference is entertainment. Case in point: the newly opened Taipei Music Center wanted to immerse guests in its second-floor VIP room with an extravagant audio and visual experience. Simply by putting on wireless head-mounted displays (HMDs), visitors can get a 360-degree view of any ongoing live concert in 8K resolution. The “VR 360 stadium experience” was achieved thanks to 5G tech and a built-in micro data center to support edge computing.

This is how it works. Data from a multitude of cameras and recording devices in the concert hall is streamed to the GIGABYTE G481-HA0, a 4U G-Series GPU Server with slots for up to ten GPGPU cards. The server processes the data in real time to recreate the concert in virtual reality. By storing and routing the footage locally instead of through a faraway cloud computing center, latency caused by network backhaul is eliminated. Viewers can enjoy a true feast for the senses thanks to edge computing.

Learn More:
《More information about GIGABYTE's GPU Server
《Glossary: What is GPGPU?

Sometimes, enjoyment may require some necessary precautions, similar to how roller coasters have seatbelts. When the Pokémon GO craze took Taiwan by storm and masses of digital critter-hunting players took to the streets, ITRI (the Industrial Technology Research Institute) and the New Taipei City Police Department were ready. They designed and deployed a “Private Cell Mobile Command Vehicle” that offered high bandwidth, low latency, and ultra-reliable communications for police officers on the job. Officers were able to monitor high-resolution video footage in real time from surveillance cameras, body-worn cameras, and aerial drones. The command vehicle also used an AI-enhanced computer vision system to identify crowded hot spots.

The GIGABYTE H281-PE0 was at the heart of the command vehicle. It served as the edge device that enabled low-latency transmissions and employed Network Functions Virtualization (NFV) technology to deliver cellular core network services. The H281-PE0 was a big step forward in the field of 5G-ready edge devices because its compact 2U form factor took up less space without sacrificing computing power or component density. The experience GIGABYTE gained from the deployment of this edge computing solution eventually paved the way for the birth of GIGABYTE’s E-Series Edge Servers.

Learn More:
《More information about GIGABYTE's High Density Server
《More information about GIGABYTE's Edge Server
Edge computing is adopted for situations where any delay is unacceptable.
Edge computing is ideal for facial recognition systems.
Security and Privacy at the Edge
Apart from the issue of latency and being closer to where data is created and processed, another reason edge computing is preferable in many situations is data security and privacy concerns. Take, for example, the smart home, which will often feature an online voice assistant device such as an Amazon Echo. This kind of device monitors sounds within your home and, like the previous example, uses a machine learning model to detect voice commands and match them to particular actions (such as turning on the lights or performing an internet search for tomorrow’s weather). It’s unlikely the homeowner would want these voice commands, and all the other private audio recorded in their house, to be sent over the internet to a remote location for matching and recognition, where the data could potentially be hacked or exposed, or even sold to other companies. Therefore, it is preferable to have these voice commands processed on the edge device, with the audio deleted once the command is executed. 《Learn More: AIoT Application-「Do You Know About AIoT? The Practical Applications of Combining Artificial Intelligence with IoT」》
Edge computing is ideal where data privacy is paramount, such as for smart home devices.
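
As an illustration, here is a minimal Python sketch of that privacy-preserving pattern, assuming a hypothetical on-device command matcher and local smart-home integration; none of these function names refer to a real assistant SDK. Recognition happens on the edge device, and the recorded audio is deleted as soon as the command has been handled.

```python
# Hypothetical sketch: on-device voice command handling for a smart home hub.
# match_command() and trigger_action() are placeholders, not a real assistant SDK.
import os
import tempfile

KNOWN_COMMANDS = {
    "turn on the lights": "lights_on",
    "what is the weather tomorrow": "weather_lookup",
}

def match_command(audio_path: str) -> str | None:
    """Run local speech recognition and map the transcript to an action.

    Placeholder: a real device would run an on-device speech model here
    instead of returning a canned transcript.
    """
    transcript = "turn on the lights"          # stand-in for local inference
    return KNOWN_COMMANDS.get(transcript)

def trigger_action(action: str) -> None:
    print(f"executing: {action}")              # hypothetical local smart-home integration

def handle_utterance(raw_audio: bytes) -> None:
    # Buffer the recording on local storage; it never leaves the device.
    with tempfile.NamedTemporaryFile(delete=False, suffix=".wav") as f:
        f.write(raw_audio)
        audio_path = f.name
    try:
        action = match_command(audio_path)     # recognition happens at the edge
        if action:
            trigger_action(action)
    finally:
        os.remove(audio_path)                  # delete the audio once the command is executed
```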
However, distributing and decentralizing computing also has a potential downside for security, since each additional location on a network represents a vulnerability that could be exploited by hackers. While the centralized nature of cloud computing allows computing devices to be managed and protected more easily and threats to be detected more readily, edge computing is more challenging, with more types of devices in remote locations that may have less robust security protections. Therefore, any company implementing an edge computing strategy must also treat security as a serious consideration, ensuring that all devices on the network are maintained and updated in a unified manner with new security patches, and feature robust protection measures such as data encryption and firewalls.
Edge Computing and 5G – Perfect Partners
Edge computing will also be broadly adopted in the next generation of 5G cellular networks. A network architecture known as MEC (Multi-access Edge Computing / Mobile Edge Computing) enables cloud computing capabilities and an IT service environment to be placed at the edge of a cellular network, such as at cellular base stations or other RAN (Radio Access Network) edge nodes. This allows the many applications that rely on the high bandwidth and high transmission speeds of 5G to be processed as close as possible to the user at the periphery of the cellular network, in order to meet the strict latency and reliability requirements of 5G while also helping network operators to reduce their network backhaul costs.

Glossary:
What is MEC (Multi-access Edge Computing)?
What is RAN?

For example, autonomous drones deployed for package delivery, bridge inspection, or crop dusting will be enabled by combining 5G wireless radio communications technology with edge computing. As an aerial vehicle, a drone needs both low-latency, highly reliable wireless communications to send and receive large volumes of data, and artificial intelligence capabilities (powered by machine learning) to make independent decisions in real time, based on data it collects from the surrounding environment as well as from the remote systems and applications managing it. However, placing enough computing power onboard the drone to run these machine learning models makes it heavier, reducing battery life and flight time. Therefore, some or all of the computing workload can instead be performed on an edge server and the results immediately relayed back to the drone using a 5G service category known as URLLC (Ultra-Reliable Low Latency Communications), ensuring the drone can make decisions immediately, without the latency or potential drop-outs that could be catastrophic for an aerial vehicle.
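
The paragraph above describes a compute-offload pattern, sketched below in Python under loudly hypothetical assumptions: the edge endpoint URL, the payload format, and the 50 ms deadline are illustrative only, and the fallback policy stands in for whatever onboard logic a real drone would use. The idea is that the drone asks the nearby edge server for a decision, but never stalls waiting on the network.

```python
# Hypothetical sketch: a drone offloading inferencing to a nearby edge server,
# with a hard deadline and a safe local fallback if the link drops.
import json
import socket
import urllib.error
import urllib.request

EDGE_ENDPOINT = "http://edge-node.local:8080/infer"   # hypothetical MEC host near the base station
DEADLINE_S = 0.05                                      # illustrative ~50 ms round-trip budget

def offload_decision(sensor_frame: dict) -> dict:
    """Ask the edge server for a decision; never stall waiting on the network."""
    req = urllib.request.Request(
        EDGE_ENDPOINT,
        data=json.dumps(sensor_frame).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=DEADLINE_S) as resp:
            return json.load(resp)                     # e.g. {"action": "descend"}
    except (urllib.error.URLError, socket.timeout):
        # Deadline missed or link dropped: fall back to a conservative action
        # chosen by the drone's limited onboard model.
        return {"action": "hover", "source": "onboard_fallback"}
```

In a real URLLC deployment the transport would be the 5G radio link itself rather than plain HTTP, but the division of labor is the same: heavy models run at the edge, and the drone keeps only a lightweight fallback onboard.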

Learn More:
eMBB Solution:《An Immersive VR Stadium Experience with 5G eMBB Technology
mMTC Solution:《A Smart City Solution with 5G mMTC Technology
URLLC Solution: 《An Autonomous Vehicles Network with 5G URLLC Technology
Autonomous delivery drones will depend on edge computing technology.
Hardware Built for the Edge – Small, Efficient, Flexible
Now that we know why edge computing is necessary, how do we go about implementing it? Computing infrastructure such as servers built for cloud computing is large and power-hungry, designed to deliver as much performance as possible, and usually requires a cold, air-conditioned, dust-free environment. Since these servers are deployed in huge numbers, they are also usually optimized for a single purpose – such as storage, CPU computing, or GPU acceleration – and are designed for huge cloud data centers where space and power are more readily available. 《Glossary: What is GPU?

On the other hand, edge computing needs to be performed close to the user’s location, in downtown or urban areas, such as in an office cabinet or at the base of a cell phone tower. Space is therefore restricted, and the power supply might be limited or expensive. In addition, there might not be air conditioning available to maintain a perfectly cooled environment, and since the server will usually be deployed as a single unit, it needs to offer a good balance of compute, storage, networking, expansion, and GPU support.
GIGABYTE has the Solution – Edge Servers for Every Situation
GIGABYTE has also begun to offer our customers a new range of servers specifically designed for edge computing, such as our H242 Series 2U 4-node edge server. They are built for edge computing applications such as MEC (Multi-access Edge Computing / Mobile Edge Computing) for 5G networks, featuring a compact form factor (short depth and height) and lower power consumption requirements, while still offering capable computing performance (with AMD EPYC or Intel Xeon processors) to run demanding virtualized workloads at the edge.

GIGABYTE’s edge server systems also feature a good balance of memory capacity, storage, and other expansion capacity (including PCIe Gen 4.0 support to utilize the latest high-speed networking technologies), as well as accelerator card support (such as for NVIDIA’s T4 GPGPU) to run inferencing workloads such as computer vision or speech recognition models, supporting AI-enabled applications and services.

Learn More:
《Recommend for you: High Density Server H242-Z10 & H242-Z11
《Glossary: What is PCIe?
GIGABYTE’s H242 Series multi-node server for edge computing.
Conclusion
Although edge computing is already in use today, its benefits will play an even more important role in enabling revolutionary new technologies on the near horizon. The time when you will be able to effortlessly stream an 8K video on your mobile phone, or step into an autonomous taxi for your ride home, is no longer on the edge of our imagination – in a few years it will be a certain reality. And that’s thanks to the technology of edge computing, made possible by GIGABYTE and our industry partners.