Delivering digital services to the network edge: Key considerations and best practices
By Ulrich Schälling, FNT Solutions
Mobility and the Internet of Things (IoT) have driven the need to build networks connecting a broad range of devices with very different application requirements and connectivity characteristics. This is one of the factors driving providers to focus on the periphery of their networks and get closer to customers—at the edge.
Providers continually strive to deliver the best customer experience possible. That means uninterrupted service and low latency. The problem, however, is that geographically dispersed users and devices make delivering an acceptable level of quality difficult: centralized storage and compute resources are simply too far away from devices for data-heavy applications like streaming video, augmented reality, and artificial intelligence.
Edge computing addresses this problem by collecting and analyzing data locally to alleviate dependence on cloud and internet connectivity. This makes a huge difference in situations where information needs to be processed quickly. When compute, storage and network connectivity are all at the edge—either on the device itself or in a local gateway—and data is processed there, the barriers of distance and subpar connectivity are removed.
This makes edge data centers critical, and telecom providers are well placed to fill this need by offering an alternative to hyperscale data centers, which centralize large blocks of infrastructure. Providers operate highly distributed, interconnected networks of smaller data centers that sit closer to where content is created.
Key factors driving telecom providers to the edge
Edge computing represents an important paradigm shift. The centralized cloud is not always the best fit for modern applications, hence the need for edge architecture. The Internet of Things, the need for low latency, and economics all play a role in shaping edge solutions.
Internet of Things. From increased adoption of wearable technology to virtual learning, smart cars, home automation, and virtual assistants, the volume of data generated and accessed by mobile devices continues to grow. The equipment that communicates with these devices needs to live closer to users to be most effective. Therefore, compute-intensive and latency-sensitive applications are best hosted at the edge of the network. It makes sense to shift mobile compute power there as well.
Edge computing provides a highly distributed computing environment that can be used to deploy applications and services as well as to store and process content near mobile users. It eliminates the need to backhaul traffic generated by applications to a data center. By setting up data centers on the edge of their networks, service providers can deliver interactive experiences with real-time applications that they would not be able to provide otherwise.
Need for low latency. Latency, IoT, and 5G are interconnected. 5G wireless networks allow data to move faster between devices in the field and edge data centers. The rise of 5G coincides with the explosion of connected devices and systems associated with IoT, in all its forms. The influx of additional latency-sensitive data generated by IoT needs to be processed in real time and is driving the need for edge computing.
Economics. A big driver of edge computing comes from business customers in industries such as water utilities, power stations, oil and gas, pharmaceuticals, and manufacturing. These industries are putting IoT devices in place so they can better monitor and manage what is going on in their infrastructure. Computing closer to the edge lets them analyze important data in near real time. Business customers can own or rent space in edge micro-data centers, giving them direct access to a gateway into a telecom provider’s broader network, which could connect to other cloud providers. Telco operators can leverage their central offices to take advantage of this demand, expand their service offerings, and diversify their sources of revenue.
On the other side of the profitability equation, edge computing has significant cost benefits for telecoms. It dramatically reduces backhaul traffic, and it cuts costs further by decomposing and disaggregating the access function and optimizing central office infrastructure. The lower-latency, higher-speed connections afforded by edge data centers drive data exchange delays and costs out of the network.
Best practices for moving to the edge
There is a huge amount of data generated over the internet, and telecom providers are well-positioned to make it useful and actionable, if they focus on four key areas.
1. Infrastructure. For latency-sensitive and high-volume traffic, it is not optimal to take data back to the center, process it, and send it back to the edge. It is much more useful to do the processing in local loops. This requires the local infrastructure to be centrally managed, including all information about IT and telecommunications assets and connections across the central data center and the edge data centers. This data should be housed in a single repository that dynamically updates as change occurs, so users across the organization are accessing the same accurate, up-to-date data at all times.
Edge computing, which meshes edge and core data center infrastructure, adds another level of complexity to infrastructure management, especially when managing a hybrid environment of network, IT, and data center elements. Delivering services and applications based on, and produced by, such an infrastructure requires visibility into all dependencies, across all resources, so that service quality stays high while costs stay affordable. A unified resource repository that encompasses all physical, logical, and virtual resources of such a hybrid infrastructure lays the proper foundation to manage this challenge.
Such a central data repository provides complete visibility and transparency throughout the IT, telecommunications, and data center network infrastructure. It effectively eliminates the barriers that prevent the sharing of this critical information and enables a single, holistic view of the infrastructure. When this information is made actionable by a management system, the infrastructure management team is able to operate, analyze, plan, implement, change, document, and monitor all technology activity more effectively, irrespective of geographic location or corporate silo.
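To make the idea concrete, here is a minimal sketch in Python of such a unified repository. The names and fields are purely illustrative rather than any specific product's data model; the point is that when every asset, whether physical, logical, or virtual, lives in one store with its dependencies recorded, a simple traversal can answer questions like 'which services are affected if this edge server changes?'

```python
# Minimal sketch of a unified resource repository (illustrative names only):
# every asset -- physical, logical, or virtual, at the core or at an edge
# site -- lives in one store, and dependencies can be traversed for
# impact analysis.
from dataclasses import dataclass, field


@dataclass
class Asset:
    asset_id: str
    kind: str                      # e.g. "server", "switch", "vm", "service"
    location: str                  # e.g. "core-dc-1" or "edge-site-042"
    depends_on: list[str] = field(default_factory=list)


class Repository:
    def __init__(self) -> None:
        self._assets: dict[str, Asset] = {}

    def add(self, asset: Asset) -> None:
        self._assets[asset.asset_id] = asset

    def impact_of(self, asset_id: str) -> set[str]:
        """Return every asset that directly or indirectly depends on asset_id."""
        impacted: set[str] = set()
        frontier = [asset_id]
        while frontier:
            current = frontier.pop()
            for a in self._assets.values():
                if current in a.depends_on and a.asset_id not in impacted:
                    impacted.add(a.asset_id)
                    frontier.append(a.asset_id)
        return impacted


repo = Repository()
repo.add(Asset("edge-srv-01", "server", "edge-site-042"))
repo.add(Asset("vnf-cache-01", "vm", "edge-site-042", depends_on=["edge-srv-01"]))
repo.add(Asset("video-svc", "service", "edge-site-042", depends_on=["vnf-cache-01"]))
print(repo.impact_of("edge-srv-01"))   # {'vnf-cache-01', 'video-svc'}
```

In practice such a repository would also carry cable routes, power feeds, and the virtual layers on top, but the principle stays the same: one store, one set of traversable dependencies.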
2. Data centers. It is more difficult to deliver a good user experience at the edge than in a central data center, where you have high-availability connectivity and power systems. Building out an edge network means changing the way you manage and run your data centers. Your systems are no longer in large, easy-to-access buildings with on-site operations teams. It is more like a cellular network, with hardware deployed in modular housings at remote sites.
An edge network is more than just connections and cabling. It is a mesh of micro-data centers that process and store critical data locally and share received data with a central data center or cloud storage repository. For an edge deployment to succeed, it must be managed alongside your existing data center, with the same processes and the same level of care. With more of these smaller edge data centers scattered around, if you are not good at remotely monitoring equipment, resolving problems, and moving load from one IT asset to another, it will be difficult to manage the large number of geographically dispersed IT assets.
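As a rough illustration of what remote monitoring across many small sites implies, the following sketch polls a set of hypothetical edge sites over an assumed health endpoint and flags the ones that need attention. The site names and URL pattern are assumptions made for the example; a real deployment would use the monitoring APIs of its own management stack.

```python
# Illustrative health poll over geographically dispersed edge sites.
# Site names and the health URL pattern are hypothetical.
import urllib.request

EDGE_SITES = ["edge-site-001", "edge-site-002", "edge-site-003"]


def site_is_healthy(site: str, timeout: float = 2.0) -> bool:
    url = f"https://{site}.example.net/health"   # hypothetical endpoint
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False


unhealthy = [s for s in EDGE_SITES if not site_is_healthy(s)]
if unhealthy:
    print("raise a ticket and shift load away from:", unhealthy)
```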
Software tools can manage the distribution and configuration of virtualized resources for an application across the relevant edge and core data centers. But there will be many applications, and there will be many different supplier- or application-specific software platforms and tools managing the configuration of the virtualized layers. All these virtualized resources are ultimately based on, and dependent on, the same underlying server and storage capacity within the data centers and connectivity capacity in between.
To set up and manage edge data centers, a centralized solution is needed to manage and optimize the entire data center infrastructure, including the central data center and all edge data centers. The solution should support capacity planning and change management with a comprehensive, integrated view of data center resources, including building infrastructure (power, cooling, floorspace), IT infrastructure (network, servers, storage), and services (software, applications).
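As a toy example of the capacity-planning side of such a solution, the sketch below checks that every site, core or edge, still has headroom on power, cooling, and rack space before a change is approved. The utilization threshold and the site figures are illustrative, not recommended values.

```python
# Toy capacity check across core and edge data centers (illustrative figures).
from dataclasses import dataclass


@dataclass
class SiteCapacity:
    name: str
    power_kw_used: float
    power_kw_total: float
    cooling_kw_used: float
    cooling_kw_total: float
    rack_units_used: int
    rack_units_total: int


def headroom_ok(site: SiteCapacity, max_utilization: float = 0.8) -> bool:
    """True if power, cooling, and rack space all stay below the threshold."""
    return all(
        used / total <= max_utilization
        for used, total in [
            (site.power_kw_used, site.power_kw_total),
            (site.cooling_kw_used, site.cooling_kw_total),
            (site.rack_units_used, site.rack_units_total),
        ]
    )


sites = [
    SiteCapacity("core-dc-1", 800, 1200, 700, 1100, 900, 1400),
    SiteCapacity("edge-site-042", 9.5, 10, 8, 12, 10, 12),
]
for site in sites:
    print(site.name, "ok" if headroom_ok(site) else "over threshold")
```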
3. Connectivity. Service providers have the infrastructure in place to connect with IoT devices, with other data centers in the edge network, with regional facilities, and with the core data centers far away. In addition to providing compute and local storage, edge data centers leverage the provider's back-end transport infrastructure to backhaul data from the edge to a centralized cloud server for user analytics and reporting.
Network connectivity is therefore vitally important, requiring rock-solid infrastructure to support edge data centers. Successful deployments use multiple connectivity points and redundant connections, both between edge data centers and from edge data centers to core data centers, capable of supporting the traffic load. This matters because if a link fails or a connection is lost, the same high-quality service can still be delivered. It may also mean mixing wired and wireless connectivity to ensure access even when one route is down.
To ensure reliable connectivity, telecoms will need to implement an infrastructure management solution that centrally manages all cable and telecommunications network and service resources, both inside plant and outside plant. The solution must be able to encompass everything from passive infrastructure all the way up the stack to active inventory, across all technologies. Such a repository will provide processes and tools with accurate information, streamline operations, and increase service quality tremendously.
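One simple check such a connectivity repository makes possible is a redundancy audit: every edge data center should reach at least two core sites over independent links. The link inventory in this sketch is invented for illustration.

```python
# Sketch of a redundancy audit on a connectivity inventory (illustrative data).
from collections import defaultdict

links = [
    ("edge-site-001", "core-dc-1"),
    ("edge-site-001", "core-dc-2"),
    ("edge-site-002", "core-dc-1"),   # only one uplink -- flagged below
]

uplinks: dict[str, set[str]] = defaultdict(set)
for edge, core in links:
    uplinks[edge].add(core)

for edge, cores in uplinks.items():
    if len(cores) < 2:
        print(f"{edge}: single point of failure, only reaches {sorted(cores)}")
```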
4. Virtualization. Edge computing enables IT, network function virtualization (NFV) and cloud-computing capabilities within the access network, in close proximity to subscribers. Service providers will need to leverage software-defined wide area networks (SD-WANs) and NFV software to deliver services. Edge computing fits nicely with 5G and SDN/NFV deployments, which run certain virtualized network functions in a distributed way, including at the edge of networks.
Not all virtual functions can be hosted centrally. While some functions can be centralized (border/gateway functions between the core network and the public internet, virtual IMS, and virtual EPC), others make more sense to distribute. For example, virtual CPE (customer premises equipment) and CDN caches need to be close to the edge of the network, as do some 5G functions such as mobility management. No provider wants to transport millions of separate video streams to endpoints from one central facility.
Because edge computing puts compute resources at the edge of telco networks, these servers can be used for distributing internal network functions. Edge computing extends virtualized infrastructure into the radio access network (RAN) and relies heavily on NFV infrastructure to create a small cloud at the edge.
To maximize the benefits of virtualization in your networks, look for a solution that can manage a hybrid network infrastructure comprising both traditional resources and virtualized ones. Introducing NFV-based solutions into a telecommunications network requires providers to centrally control and optimize the server and software infrastructure, the virtual network functions (VNFs), the NFV infrastructure, and their interaction with physical network functions and the network.
This requirement can be met by introducing a hybrid resource management solution that centrally manages, plans, and documents all relevant physical, logical, and virtual resources, capacities, and assets across the telecommunications network, IT, and data center infrastructure, regardless of where they are located. Because the products and services that are ultimately delivered to customers are made up of both traditional network services and virtualized components, seamless navigation across these different types of resources will be a crucial success factor.
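In a highly simplified form, the placement reasoning described above comes down to matching each virtual function's latency budget against what the core can deliver. The budgets and the threshold in this sketch are assumptions made for illustration, not standardized values.

```python
# Illustrative placement rule: latency-sensitive functions go to an edge site,
# functions that tolerate longer round trips stay in the core.
# All latency budgets and the threshold below are assumed values.
VNF_LATENCY_BUDGET_MS = {
    "cdn-cache": 10,
    "vcpe": 15,
    "mobility-mgmt": 20,
    "vims": 80,
    "vepc-control": 80,
    "border-gateway": 100,
}

EDGE_THRESHOLD_MS = 25   # assumed cutoff for "must run at the edge"


def place(vnf: str) -> str:
    """Return 'edge' or 'core' for a given virtual network function."""
    return "edge" if VNF_LATENCY_BUDGET_MS[vnf] <= EDGE_THRESHOLD_MS else "core"


for vnf in VNF_LATENCY_BUDGET_MS:
    print(f"{vnf:15s} -> {place(vnf)}")
```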
Market opportunity for service providers
Telecom providers are bringing cloud capabilities to the edge of the network, reducing latency and lightening the network and compute load carried back to centralized data centers. They must continue to invest in management and orchestration frameworks to be successful in this paradigm. Access to back-end IT infrastructure in the cloud will be pivotal. Creating the network fabric that leverages that back-end infrastructure will be a differentiating factor.
Providers must place more intelligence at the edge of the network using gateways that host applications and provide access to local storage as close as possible to the point of application consumption by the end user. Those applications in turn can then leverage back-end cloud resources to access data as needed. To ensure high quality of service at affordable cost, providers must ensure transparency and visibility across these hybrid infrastructures to efficiently manage and keep pace with increased complexity. Telco services are becoming inherently edge-oriented, and they will depend on virtualization for distributed capabilities so they can live on the edge of networks.
This is an exciting time for service providers.
Ulrich Schälling is head of business line networks with FNT Solutions.