The Fragmenting Datacenter
For many organizations, the traditional datacenter is splintering so that workloads can run in the place best suited for them, based on variables such as performance, cost, security, and management. Those processing locations can be the datacenter, public cloud infrastructure, and, increasingly, small satellite datacenters that aggregate data processing from nearby devices. Some workloads simply run best directly within the network-connected machines we call Internet of Things (IoT) devices and sensors, and those devices might be anywhere in the world.
Welcome to the modern-day “edge.”
The primary goal of edge computing is to reduce latency in situations where instantaneous application response times can save an organization piles of money, vastly improve processes and decision-making, and even save lives.
IoT and Industrial IoT (IIoT) devices can be located in planes, driverless cars, agricultural fields, underwater robots, oil rigs, and a million other places. Many are hard at work collecting data that can be most valuable at the location where it’s created, not after it’s been hauled back over a long-distance network to a corporate datacenter or public cloud, then sliced, diced, and evaluated. In today’s world, there are classes of actionable data that age fast.
Stale data can have disastrous results. A delay of mere milliseconds can make the difference when a self-driving car needs to detect and avoid a pedestrian, which is why cars will continue to process the most important data on board, locally on the device edge. Similarly, when a surveillance system with facial recognition tries to identify a fleeing criminal, it may not be desirable to risk a network path that traverses multiple networks into and out of the public cloud.
How The Edge Has Evolved
The edge is an IT concept that’s been around for decades. If you want to consider the big picture, you can think of the edge as the intelligent device (or collection of intelligent devices) closest to one of the following:
- The external routers in your datacenter (your edge)
- The transition point between communications networks (the network edge)
- A person using the device to get a computational result or consuming content from it (the device edge)
- An unmanned machine (IoT device) using local compute to derive a computational result (device edge)
Between Networks
Until recently, the term “edge” was used predominantly in the context of communications networks. In networking parlance, the edge signifies a device, usually a router (or, going back a couple of decades, multiplexers, gateways, and other gear that may have attached to routers), that links local corporate networks to a wide-area network (WAN) and/or the Internet.
The WAN edge was, and still is, physically one or more such devices in a datacenter, server room, or wiring closet, each with a LAN connection on one side and a WAN connection on the other. When mobile networks took off, of course, the network edge expanded to include anywhere there was a user with a smart mobile device and a network connection.
Between Users And Networks
In my time as CIO at Pandora, we focused on the edge concept from the perspective of a content provider running its own content-delivery network, or CDN. As a music streaming and Internet radio service, we cared greatly about quality of service. No one wants to listen to music that’s choppy or has dead spaces while the content stream is buffering. CDNs, like those operated by Akamai, Cloudflare, Fastly, Netflix, and others, use a number of tricks to accelerate the delivery of music and other media to Internet-connected devices and avoid this type of performance degradation.
A primary technique they use is edge caching, which involves storing replicas of content on multiple servers placed strategically at the edges of the Internet so that they’re closer to users. Because physical distance incurs latency, and latency is anathema to real-time content like streaming music, caching at the network edge became an important way for Pandora to shrink the physical distance between users and the source of our content. Those edge caches boosted performance and, in turn, the all-important customer experience.
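To make the idea concrete, here is a minimal sketch of edge caching using the common cache-aside pattern: an edge node serves content from a local store when it can, and reaches back to the origin only on a miss. The class, TTL value, and content keys below are hypothetical illustrations under those assumptions, not how Pandora or any particular CDN actually implements its caches.

```python
import time


class EdgeCache:
    """Minimal in-memory edge cache (cache-aside pattern) with TTL expiry."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (payload, expiry timestamp)

    def get(self, key, fetch_from_origin):
        """Return cached content if still fresh; otherwise fetch from origin and cache it."""
        entry = self.store.get(key)
        now = time.time()
        if entry and entry[1] > now:
            return entry[0]                       # cache hit: served locally, low latency
        payload = fetch_from_origin(key)          # cache miss: one round trip to the origin
        self.store[key] = (payload, now + self.ttl)
        return payload


# Hypothetical usage: an edge node serving an audio segment to a nearby listener.
cache = EdgeCache(ttl_seconds=600)
segment = cache.get("track-123/segment-07.aac",
                    fetch_from_origin=lambda key: f"<bytes of {key} from origin>")
```

The point of the pattern is that only the first request for a piece of content pays the long-haul latency to the origin; every subsequent nearby listener is served from the edge.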
Similarly, traditional communications service provider (CSP) networks have edges. Because these edges are critical to CSPs’ ability to deploy services with solid customer experiences, there’s an Open Networking Foundation project called Central Office Re-architected as a Datacenter (CORD), which uses virtualization and cloud-native technologies to help the CSP edge deliver similarly high-performing end-user experiences.
Reducing latency and boosting performance may be even more important in IoT applications, where response times can affect a person’s health, safety, or decision-making. The challenge parallels what we accomplished with CDNs and what CSPs are now trying to do with their own networks.