In the traditional data center environment, most data processing was handled locally. Then came the cloud with its centralized data processing architecture. The cloud offers greater flexibility and lower costs than traditional on-premises infrastructure, but comes with the tradeoff of higher latency: data must travel long, complex paths between Internet nodes and gateways, which impacts application performance.
These bottlenecks became noticeable with the advent of the Internet of Things (IoT). Internet-connected devices and sensors provide organizations with valuable data that can help increase operational efficiency, improve customer service, and drive the development of new products, services and business models. However, IoT data needs to be analyzed in near real time in order to generate actionable insights. This is difficult when the data must travel to a central data center or cloud environment for processing.
The IoT has given rise to the concept of edge computing, which moves processing power to the edge of the network, closer to the data source. Edge computing involves the use of smaller compute platforms near IoT devices, or within the devices themselves, to reduce the distance data must travel prior to processing and analysis. This approach increases performance and throughput, conserves bandwidth, improves data quality and reliability, and helps overcome weak network connections in remote locations.
But while the IoT has been a primary driver of edge computing adoption, it is far from the only use case. In fact, the State of the Edge 2018 Report from Edge Research Group and Structure Research emphasizes the end-user perspective when discussing the trends that are having the greatest impact on edge computing today. End-users accessing applications and data from a centralized cloud data center experience the same latency issues as IoT devices, and can benefit from an edge computing model.
The report defines three layers of the edge computing architecture:
- Infrastructure edge: the service provider side of the last-mile network
- Access edge: the portion of the infrastructure edge closest to the end-user
- Aggregation edge: the portion of the infrastructure edge that serves as an aggregation point for IT resources on the access edge
The aggregation edge can be thought of in terms of a content delivery network (CDN) that caches data at various aggregation points and handles certain end-user requests at those points rather than sending them to a centralized data center. The aggregation edge goes even further by enabling organizations to run specific workloads on edge platforms.
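The caching behavior described above can be sketched in a few lines. This is a minimal, hypothetical model, not a real CDN implementation: an aggregation point serves repeated requests from its local cache and forwards only cache misses to the central data center. All names here (`EdgeCache`, the sample catalog) are illustrative.

```python
class EdgeCache:
    """Hypothetical aggregation-edge node that caches responses locally."""

    def __init__(self, origin):
        self.origin = origin  # fallback lookup at the central data center
        self.cache = {}       # local cache keyed by request path

    def handle(self, request):
        # Serve from the edge when possible; only misses travel upstream.
        if request in self.cache:
            return self.cache[request], "edge"
        response = self.origin(request)
        self.cache[request] = response
        return response, "origin"


# Illustrative origin lookup standing in for the central data center.
catalog = {"/video/42": "video-bytes", "/page/home": "html"}
edge = EdgeCache(lambda req: catalog[req])

print(edge.handle("/page/home"))  # first request travels to the origin
print(edge.handle("/page/home"))  # repeat request is answered at the edge
```

The payoff is that only the first request for a given item pays the full round trip to the central data center; subsequent requests are answered at the aggregation point.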
Software-as-a-Service (SaaS) providers are already starting to take a distributed network approach to the delivery of cloud applications. Microservices are being deployed across multiple points of presence to bring services closer to end-users. Application traffic is first sent to an edge platform rather than the cloud data center.
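The routing decision implied here can be sketched as picking the lowest-latency platform among the available points of presence. This is a simplified illustration under assumed names and latency figures, not any SaaS provider's actual routing logic:

```python
# Hypothetical points of presence with measured round-trip times in ms.
pops = {"edge-east": 12.0, "edge-west": 48.0, "central-cloud": 95.0}

def route(latencies):
    """Direct traffic to the platform with the lowest measured latency."""
    return min(latencies, key=latencies.get)

print(route(pops))  # → edge-east
```

In practice this decision is made by DNS-based or anycast traffic steering, but the principle is the same: application traffic lands on the nearest edge platform before anything is forwarded to the cloud data center.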
As the volume of data sent to and from applications continues to increase, organizations should evaluate edge computing architectures for their in-house and private cloud workloads. Software development frameworks are evolving to allow for more data processing at the aggregation edge. This can create a better user experience and allow for real-time data analytics.
Edge computing also reduces security risks because data flow between devices and the central data center is minimized. Maintaining data at the network edge can also facilitate compliance with industry and government regulations.
Technologent has extensive IT infrastructure expertise and can help you architect and manage an edge computing environment. Let us show you how bringing data processing to the network edge enables you to improve business operations and create competitive advantages.
July 12, 2018