
Edge Computing Use Cases


Ever wonder how your Netflix movie streams so smoothly or why your Google Maps responds instantly? That’s edge computing at work.

Edge computing brings data processing and storage closer to the source, speeding up response times and enhancing user experiences. And edge computing can apply to many industries, not just the obvious ones such as streaming or consumer-facing services.

With the surge of interest in AI, running applications closer to users or IoT devices is even more important, since AI requires significant data to build and apply models. To stay ahead, most organizations will need edge computing in the future.

In this deep dive, we’ll explore real-world applications of edge computing—from boosting autonomous vehicles’ brains to empowering multi-camera video analytics—and show how 5G technology supercharges its capabilities.


The Concept of Edge Computing

So what exactly is edge computing? Let’s dig into it. According to Gartner, the idea is to bring computation and data storage closer to where they are needed. This means placing them on devices or local servers, rather than relying on a central location that is far away. This shift in the distributed computing topology aims to improve response times and save bandwidth.

With a traditional pure-cloud model, when you hit ‘save’ on that big report you’ve been working on all day, that information must travel miles (sometimes hundreds or even thousands of miles, over many network hops) before it reaches a centralized data center for processing or storage.

In contrast, edge computing places infrastructure much closer to you, so those tasks can be handled without the long-distance travel. And here we start seeing some real advantages:

Major Advantages of Edge

  •  Faster responses due to reduced latency – crucial for applications like autonomous driving or online gaming where milliseconds matter.
  •  Data privacy can be better protected, since sensitive info doesn't need long trips across various networks.
  •  A reduction in network traffic, which means less congestion and smoother operations overall.

The Transformative Power of Edge Computing

Why does this change matter so much now? One major factor is the Internet of Things (IoT). The number of connected devices is skyrocketing. Statista projects it to reach 30 billion by 2025, a fourfold increase from 2017.

All these devices—from your smart thermostat at home to sensors in industrial machinery—are constantly generating data that needs processing. And this isn’t just any data—it’s often real-time and requires immediate action to be useful for the given application.

The Relationship Between Edge Computing and Real-Time Applications

Real-time applications are fueling the growth of edge computing. Why? It’s simple: they demand rapid processing speeds that traditional cloud-based systems can’t always provide.

Imagine you’re driving an autonomous vehicle (AV). Your AV must process vast amounts of data on-the-fly to ensure a safe journey.

That includes everything from recognizing stop signs to avoiding pedestrians and much more. However, this needs to happen in a split second. A delay as short as one second could be catastrophic.

Edge computing allows for real-time analysis at the source of data creation, such as your AV. This enables lightning-fast decisions without significant latency issues.

According to Gartner, enterprises will process 75% of the data they generate outside centralized data centers by 2025, mainly because real-time applications require low-latency solutions.

Some examples:

  • Camera video analytics & AI visual processing: Much security footage is never reviewed - in fact, 15% of businesses never look at their security cameras at all. AI is already fixing this by actively monitoring footage live and alerting users. AI-assisted video analytics is also spreading across other industries: manufacturing performance, safety monitoring, and object and person detection. Video inherently requires edge computing because of its heavy data volumes and real-time use cases.

  • Agricultural IoT devices: On farms around the world, smart sensors monitor soil conditions, weather patterns, and crop health, constantly sending out streams of information. The localized processing that edge computing offers helps farmers make informed decisions in real time, optimizing yields and reducing waste.

  • Smart manufacturing: Factories equipped with IoT sensors generate massive amounts of data. Edge computing allows this data to be processed on-site, enabling quicker decision-making for things like equipment maintenance or quality control.

All these applications require immediate action based on the analysis of incoming data streams. This is where edge computing shines by providing low-latency processing capabilities right at the source of information generation.

The benefits don’t stop there though. By only sending relevant extracted insights over networks instead of raw voluminous data, you save bandwidth costs as well.
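As a rough illustration of that filter-at-the-edge pattern, here is a minimal Python sketch. The threshold, the `motion_score` heuristic, and the alert payload shape are all hypothetical stand-ins, not a real product API:

```python
# Hypothetical edge-side filter: score incoming frames locally and
# forward only compact alerts upstream, instead of the raw stream.

ALERT_THRESHOLD = 0.8  # assumed tuning value, not a standard


def motion_score(frame: list[float]) -> float:
    """Stand-in for a real model: average pixel delta as 'motion'."""
    return sum(frame) / len(frame)


def extract_insights(frames: list[list[float]]) -> list[dict]:
    """Process every frame at the edge; emit only the alerts."""
    alerts = []
    for i, frame in enumerate(frames):
        score = motion_score(frame)
        if score >= ALERT_THRESHOLD:
            # Upstream gets a few bytes of insight, not the whole frame.
            alerts.append({"frame": i, "score": round(score, 2)})
    return alerts


# Three tiny "frames": only the middle one crosses the alert threshold.
stream = [[0.1, 0.2], [0.9, 0.9], [0.0, 0.1]]
print(extract_insights(stream))  # → [{'frame': 1, 'score': 0.9}]
```

A real deployment would run an actual detection model on the device or gateway, but the shape is the same: heavy processing stays local, and only small, actionable results cross the network.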

Edge computing is transforming how we process data, letting us make quicker decisions and cut down on latency problems. It’s so powerful that experts predict three-quarters of all enterprise-generated data will be processed in this way within just a few years.

The Synergy Between 5G and Edge Computing

When we talk about a perfect match, few can rival the synergy between 5G and edge computing. This partnership is not just turning heads in tech circles but revolutionizing how data is processed, transmitted, and stored.

How 5G Enhances Edge Computing Capabilities

If you think of edge computing as an efficient postal service that brings processing power closer to devices for faster delivery (of data), then imagine what happens when it gets a speed boost. That’s exactly what 5G delivers, with its high-speed connectivity and lower latency.

A key stat worth noting here: with roughly one-tenth the latency of its predecessor, 5G lets your applications react more swiftly - like having coffee ready before you’ve even finished yawning.

This enhanced capability doesn’t just make things quicker; it transforms use cases across industries. With 5G’s improved latency, autonomous vehicles can make more accurate decisions on the fly. The same goes for other time-sensitive applications such as drone control or remote surgery where every millisecond counts.

Addressing Network Latency with 5G and Edge Computing

Nobody likes waiting – especially not your apps. Delays in data transfer slow application response times, leading to laggy user experiences - bad news if you’re trying to keep users engaged or if critical operations depend on those apps.

In comes our dynamic duo: by moving computation closer to where data is generated via edge nodes, and coupling that with the fast transmission speeds of 5G, network latency gets a one-two punch. Your applications don’t just run faster; they feel quicker to users too.

The numbers speak for themselves: edge computing can reduce response times by over 30%, and when combined with the power of 5G, we’re looking at almost LAN-level response times in many applications.

Together, 5G and edge computing are transforming everything from self-driving cars to remote medical procedures. It’s not just about doing things faster; it’s about unlocking potential in entirely new areas and setting a new standard for the digital age.

The Role of Edge Gateways in Edge Computing

Edge gateways play a crucial role in edge computing. They’re like the maestros of an orchestra, coordinating all elements to create harmonious data flow from devices at the edge to central servers.

Efficient Data Processing and Transmission with Edge Gateways

Think of your favorite restaurant. You wouldn’t drive miles away for each ingredient needed for your meal, right? Similarly, edge gateways bring resources closer to where they are needed most: the devices generating data on-the-fly. This local sourcing helps reduce latency and allows real-time processing.

In fact, according to Gartner’s research, 75% of enterprise-generated data will be processed outside traditional centralized cloud locations by 2025 - mainly because gateways like these make it possible.

An excellent example is autonomous vehicles that rely heavily on quick decision-making based on sensor-derived information such as lidar or radar inputs. A delay could mean catastrophe; hence efficient transmission becomes critical.

Besides being faster than sending every byte back home (to the core network), edge gateways bring another benefit: bandwidth optimization. By processing some data locally and transmitting only what’s necessary upstream, they significantly reduce overall network traffic.
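To make that bandwidth-optimization idea concrete, here is a small Python sketch of gateway-side aggregation. The sensor values, batch size, and summary fields are all assumptions for illustration; a real gateway would pick whatever summary its upstream consumers actually need:

```python
# Hypothetical gateway aggregation: collapse a batch of raw sensor
# readings into a compact summary before shipping anything upstream.
import json


def summarize(readings: list[float]) -> dict:
    """Keep only the fields the cloud actually needs."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(sum(readings) / len(readings), 2),
    }


# Imagine ten minutes of temperature readings, one per second.
raw = [round(20.0 + (i % 50) * 0.1, 1) for i in range(600)]
summary = summarize(raw)

raw_bytes = len(json.dumps(raw).encode())
summary_bytes = len(json.dumps(summary).encode())
print(summary)
print(f"upstream payload: {summary_bytes} bytes instead of {raw_bytes}")
```

The savings grow with the batch: the raw payload scales with every reading, while the summary stays a fixed handful of bytes.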

Making Devices Smarter with Local Processing Power

To better understand how this works let’s think about cooking again. Imagine you’re making a delicious lasagna but need help chopping vegetables while managing multiple steps simultaneously (sautéing onions anyone?). What if you had an assistant chef who could help you prepare the veggies? Suddenly, cooking becomes more manageable and efficient.

In this analogy, your edge devices are chefs trying to juggle multiple tasks at once. Edge gateways provide that extra pair of hands (or processing power) right there in the kitchen. This offloading is particularly beneficial for resource-constrained IoT devices like sensors or actuators.

It’s not just about handling current workloads either; edge computing also paves the way for newer applications needing low latency or high bandwidth - from AI-driven industrial automation to immersive AR/VR experiences.

Physical Architecture of Edge Computing

Edge computing is an innovative approach to handling data, and its physical architecture plays a crucial role. This design brings computation closer to the source of data generation - edge devices like IoT sensors or smartphones.

The Client-Edge Module Connection

To understand the client-edge module connection, imagine a bustling city. Your smartphone (the client) wants to find the quickest route home during rush hour traffic. Rather than sending your request miles away to a centralized cloud server for processing, it instead reaches out to nearby edge modules—akin to local traffic control centers.

This proximity enables quick response times and reduces latency—the delay before data transfer begins following an instruction for its transfer—an essential feature in time-sensitive applications such as autonomous vehicles or remote surgery (Nature). It’s akin to asking your neighbor about current road conditions rather than waiting on a far-off friend’s text message.

Apart from speedy responses, this setup also saves bandwidth by doing most of the heavy computational work locally at these edge nodes. For instance, when streaming videos on platforms like Netflix or YouTube (Data Center Knowledge) you’re actually connecting with their distributed servers placed close-by rather than some central hub located halfway across the globe.

Understanding Micro Data Centers in Edge Computing

In our busy city analogy, micro-data centers (often termed “MDC”) are miniaturized versions of traditional large-scale facilities that serve as localized hubs within each neighborhood—they’re essentially decentralized “cloudlets”. These are responsible for managing edge modules and performing the majority of computational tasks.

They work as a localized form of cloud computing, helping to decrease latency and enhance data processing speed. This allows devices at the “edge” to perform their functions more efficiently. Think of them as community centers—each handling its neighborhood’s needs without burdening city hall.

Deployment Options for Edge Computing

Edge applications can be deployed in many ways. Let’s cover some of the more common approaches:

Traditional firmware / OS deployment

Tried and true, and well understood by any IT team, is simply managing edge deployments with the same tools you would use to manage remote PCs, servers, or embedded devices. This might be via SSH and a firmware upgrade through scp, or through the various vendor-specific tools available from popular distributions such as Fedora, RHEL, or Ubuntu.

Upside: Everyone in IT understands these methods already.

Downside: If you thought managing hundreds of servers or clusters was hard, try managing thousands of edge machines… in thousands of locations. These methods just don’t scale.
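A minimal sketch of what this looks like in practice, assuming a hypothetical firmware image and `fwupdate` tool (the hostnames and paths are made up, and the script only prints the commands it would run rather than executing them):

```python
# Hypothetical fleet-update script for the ssh/scp approach: push a
# firmware image to each device, then trigger the upgrade remotely.
# 'fwupdate' and the host/path names are illustrative assumptions.

FIRMWARE = "edge-fw-2.4.1.img"  # assumed local image name


def update_commands(host: str) -> list[str]:
    """The two manual steps an operator would otherwise type per box."""
    return [
        f"scp {FIRMWARE} admin@{host}:/tmp/{FIRMWARE}",
        f"ssh admin@{host} 'fwupdate apply /tmp/{FIRMWARE} && reboot'",
    ]


fleet = [f"edge-{n:04d}.example.com" for n in range(3)]  # imagine 1000s
for host in fleet:
    for cmd in update_commands(host):
        print(cmd)  # dry run: print instead of executing
```

Even as a dry run, the scaling problem is visible: every new device multiplies the commands, the failure modes, and the retries you have to track yourself.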

Traditional OS management + Docker

There are quite a few vendors who have container-based solutions combined with a traditional OS update. This solution allows you to containerize applications, providing better security and easy packaging, while not having to dive into more sophisticated container orchestration.

Upside: Provides better security, and software updates are easier than traditional methods.

Downside: Does not provide any concept of “fleet management” - where you can deal with many edge devices at once. Nor does it allow the easy interconnection of multiple containers.

Orchestration at the Edge

A much more modern technique is using container orchestration (or “edge orchestration”), namely via Kubernetes, to manage edge machines across a wide geography. Beyond containerization, this provides features such as connecting containers together to build more complex applications, managing the secrets needed to access services, and improving security between applications.

Upside: Provides the best manageability and security at the edge. It also leverages existing cloud and DevOps teams, which already understand orchestration and have the tools to use it.

Downside: It’s more complex, and requires a learning curve or a team that already understands container orchestration.
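To make the orchestration option concrete, here is a minimal, hypothetical Kubernetes Deployment for an edge application. The app name, image, and replica count are placeholders; a real edge cluster would typically add node selectors, resource limits, and secret references:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: edge-analytics            # hypothetical app name
spec:
  replicas: 1                     # e.g. one instance per edge site
  selector:
    matchLabels:
      app: edge-analytics
  template:
    metadata:
      labels:
        app: edge-analytics
    spec:
      containers:
      - name: analytics
        image: registry.example.com/edge-analytics:1.0  # placeholder image
        ports:
        - containerPort: 8080
```

The same manifest can then be rolled out to thousands of edge clusters from one place, which is exactly the fleet-management capability the earlier options lack.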

Learn more about Izuma Edge-as-a-Service