
EDGE COMPUTING
Edge computing is already in use, from the wearable on your wrist to the computers parsing intersection traffic flow. Smart utility grid analysis, oil rig safety monitoring, streaming video optimization, and drone-enabled agricultural management are some other examples.
These applications are on the rise. Gartner estimates that less than 10% of enterprise-generated data is created and processed at the edge today, and forecasts that share will reach 75% by 2025.
Edge computing can be difficult to explain to non-technical audiences, partly because this type of data processing can occur in various ways and in a range of locations. At its most basic level, edge computing is the practice of capturing, processing, and analyzing data close to its source.
WHAT IS EDGE COMPUTING?
“For edge devices to be smart, they need to process the data they collect, share timely insights, and if applicable, take appropriate action. Edge computing is the science of having the edge devices do this without the need for the data to be transported to another server environment,” says Red Hat chief technology strategist E.G. Nadhan. “Put another way, edge computing brings the data and the compute closest to the point of interaction.”
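To make Nadhan’s description concrete, here is a minimal sketch in Python of that sense-decide-act loop running entirely on a device. The read_sensor and trip_breaker helpers are hypothetical stand-ins for real device I/O, not part of any quoted product; note that nothing leaves the device except a compact summary at the end.

import random
from collections import deque

def read_sensor():
    # Hypothetical stand-in for real device I/O (e.g., a temperature probe).
    return 20.0 + random.gauss(0, 1.5)

def trip_breaker():
    # Hypothetical local actuator: the "appropriate action" taken on-device.
    print("ALERT: threshold exceeded; acting locally, no server round trip")

window = deque(maxlen=60)    # the last 60 readings, held in device memory
THRESHOLD = 25.0

for _ in range(120):         # the sense-decide-act loop runs entirely at the edge
    reading = read_sensor()
    window.append(reading)
    if reading > THRESHOLD:  # the decision is made where the data is created
        trip_breaker()

# Only a compact insight (never the raw stream) leaves the device.
summary = {"mean": sum(window) / len(window), "max": max(window)}
print("summary to publish upstream:", summary)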
It’s “everything not in the cloud,” says Ryan Martin, principal analyst with ABI Research. “If we think about a hub-and-spoke model, the cloud is the hub, and everything on the outside of the spokes is the edge.” This decentralized approach enables organizations to move processes like analytics and decision-making closer to where the data is actually produced.
“Put simply, edge computing is data analysis that takes place on a device in real time,” says Nima Negahban, CTO of Kinetica. “Edge computing is about processing data locally, and cloud computing is about processing data in a data center or public cloud.”
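One way to see Negahban’s local-versus-data-center distinction is to compare what each approach puts on the wire. The rough Python sketch below uses simulated readings; the numbers and field names are illustrative assumptions, not figures from the article.

import json
import random

# An hour of one-per-second simulated sensor samples.
readings = [round(20.0 + random.gauss(0, 1.5), 2) for _ in range(3600)]

# Cloud-style: ship every raw record to a data center and analyze it there.
raw_payload = json.dumps(readings).encode()

# Edge-style: analyze on the device in real time; transmit only the result.
edge_result = {
    "mean": round(sum(readings) / len(readings), 2),
    "max": max(readings),
    "over_25": sum(r > 25.0 for r in readings),
}
edge_payload = json.dumps(edge_result).encode()

print(f"raw upload:  {len(raw_payload):,} bytes")
print(f"edge upload: {len(edge_payload):,} bytes")

The raw upload runs to tens of kilobytes for a single sensor-hour, while the locally computed result is well under a hundred bytes, which is why processing at the source matters when bandwidth, latency, or connectivity is constrained.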