
Four Killer Services for the Wireless Edge

By Joseph Noronha | October 4, 2018 (updated October 5, 2020) | Blog

As telco operators look to harden their business models around edge investment, the conversation always comes back to use cases. Joseph Noronha of Detecon serves up what he sees as four killer services for the wireless edge.

Editor’s Note: This is a guest post from an industry expert. The State of the Edge blog welcomes diverse opinions from industry practitioners, analysts, and researchers, highlighting thought leadership in all areas of edge computing and adjacent technologies. If you’d like to propose an article, please see our Submission Guidelines.

In my previous post, I argued that the infrastructure edge, owned and operated by the telecom carriers, has become the critical bridge between the wired and wireless worlds. Telco operators have unique assets and are among the best positioned to catalyze the next generation of internet apps by deploying what I call “killer services.” These killer services could create new revenue streams for telco operators and accelerate the work of edge-native developers.

When I discuss edge computing among my peers, it quickly comes down to the “use case” question. Every operator would like to see concrete and validated use cases before doubling down on their commitment to edge computing. While most operators today see the inevitability of edge computing, they don’t fully agree on whether it is an extension of business as usual (BAU) or an opportunity to create a whole new market. Instead of engaging in this debate, I propose to lay out what I see as four killer service categories for the wireless edge, which are:

  • Network services
  • Data compression/condensation
  • Compute offload
  • Artificial Intelligence

I’ll tackle each of these in turn.

Network Services

What is It?

Telco networks are becoming increasingly programmable, and operators now have the tools to expose network assets in a manner that application developers can easily consume. This is an evolutionary rather than a revolutionary case, but one of the most powerful. Think of the APIs that companies like Twilio offer. At their core, these APIs take straightforward telco capabilities such as SMS and voice calls and encapsulate them in a developer-friendly offering. At the wireless edge, there are new types of capabilities that are mostly mobile, location-specific, and temporal, such as network congestion or traffic type within a cell sector. How could this information be monetized? It is not that operators have not tried; Norway’s Telenor is a good example. Yet for all the power of the Telenor APIs, they never became a widespread solution. We still do not have a large-scale, multi-country way to programmatically access wireless network services, which hinders adoption by developers. Another barrier is usability: apart from a few exceptions such as Telenor, I daresay many developers would rather do without than jump through the hoops required to understand, gain access to, and use these assets.
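
To make this concrete, here is a minimal sketch of what a developer-facing network API at the wireless edge might look like. The endpoint, field names, and token handling are all hypothetical, invented for illustration rather than drawn from any real operator’s offering; the point is only that something like cell-sector congestion could be exposed as a simple, consumable call.

```python
import requests

# Hypothetical, illustrative endpoint and token -- not a real operator API.
EDGE_API = "https://api.example-operator.com/v1/network"
TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder credential

def get_sector_congestion(cell_id: str) -> dict:
    """Fetch current congestion metrics for one cell sector (sketch)."""
    resp = requests.get(
        f"{EDGE_API}/sectors/{cell_id}/congestion",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"load": 0.82, "active_devices": 417}

metrics = get_sector_congestion("sector-1234")
if metrics.get("load", 0) > 0.8:
    print("Sector congested: defer bulk uploads or reroute traffic.")
```

The value for developers is less the specific metric than the fact that it is local, current, and reachable through the same kind of API they already use for cloud services.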

Why is it a Killer Service?

The infrastructure edge, at its essence, is all about location and things that are locally relevant. Use case scenarios such as anti-spoofing, location collaboration, and network utilization have limited benefit if handled by the central cloud; the edge serves as a quicker, simpler way of offering these services for developers, which can unlock new services for customers and new revenue streams for operators.

Data Compression/Condensation

What is It?

Compressing and condensing data close to the network edge addresses two challenges faced by today’s wireless networks:

  • Traffic today is highly asymmetric: roughly 90-95% of it flows downstream, with limited upstream capacity. As enterprises and cities deploy IoT sensors that generate terabytes of data and consumers start looking to upload 8K video, it is anyone’s guess what long-term impact that will have on the network.
  • Shipping data to a centralized facility is not free. To an end consumer it may seem free, but when you are transmitting zettabytes of data the costs add up quickly.

This raises the question: is all of this data relevant and essential enough to be handled in the central cloud? Would it be more prudent and economical to condense it close to the source in order to manage the network load?
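
As a rough sketch of what “condensing close to the source” could mean in practice, the following Python snippet (with invented field names) reduces a window of raw sensor samples to a compact summary so that only the summary, not the raw stream, crosses the backhaul.

```python
from statistics import mean
from typing import Dict, List

def condense_readings(readings: List[float], window_id: str) -> Dict:
    """Collapse a window of raw sensor samples into a compact summary."""
    return {
        "window": window_id,
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 3),
    }

# Example: 1,000 raw temperature samples become a five-field record,
# so only the summary (not the raw stream) is shipped upstream.
raw = [20.0 + i * 0.001 for i in range(1000)]
print(condense_readings(raw, "2018-10-04T12:00Z"))
```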

Why is it a Killer Service?

While 5G promises significant increases in data-carrying capacity, simply shunting data around requires additional spend, and either the operator pays for it, or the developer of the service that uses the bandwidth does, or the cost is passed on to the end consumer in the form of higher prices. If there is a way to handle this stream more intelligently, it could provide manifold benefits to all three parties in the equation.

Compute Offload

What is It?

Compute offload refers to either moving compute off the device or bringing the central cloud closer to the user. This is not necessarily about low latency; adding latency targets and strict SLAs to the equation brings a host of other challenges. It is more about saving battery life, form factor, and cost on mobile devices while offering users significantly greater computing power at their fingertips. Mobile operators can offer cloud-like services at the edge and charge for them in ways similar to centralized cloud services. You would not need them to run all your applications, but on the occasions when you do, you could tap into the resources at the wireless edge. In turn, the wireless edge can run at high utilization by serving a large number of users: a win-win scenario. With specialized computing capabilities at the edge, such as those offered by GPUs, end users may get longer lives out of their smartphones. If latest-generation phone capabilities are delivered via edge computing, then consumers may not need to upgrade their phones every 6 to 18 months to get the latest capabilities.
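
Here is a minimal sketch of what the offload decision could look like, assuming a hypothetical edge compute endpoint and placeholder device signals (battery level, edge availability); none of the URLs or thresholds below come from a real service.

```python
import requests

# Hypothetical edge endpoint for a heavy rendering task -- illustration only.
EDGE_COMPUTE_URL = "https://edge.example-operator.com/v1/offload/render"

def render_locally(scene: bytes) -> bytes:
    """Placeholder for the on-device fallback path."""
    return scene

def render(scene: bytes, battery_pct: int, edge_available: bool) -> bytes:
    """Send heavy work to the edge when the device is constrained."""
    if edge_available and battery_pct < 30:  # arbitrary example threshold
        resp = requests.post(EDGE_COMPUTE_URL, data=scene, timeout=10)
        resp.raise_for_status()
        return resp.content
    return render_locally(scene)
```

The design choice worth noting is that the device keeps a local fallback; the edge augments the handset rather than becoming a hard dependency.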

Why is it a Killer Service?

Just because applications continually demand more resources, and device vendors keep having to provide better chipsets in response, does not mean that model will continue to scale. There are two issues here. First, device release cycles are still much slower than app release timelines, which means even the highest-end phones can quickly fall out of date. Second, it is not clear that today’s device economics will continue to work out (I may stand corrected if, within two years, people get used to paying $2,000 for an iPhone, but I suspect there is a threshold).

Artificial Intelligence

What is It?

Artificial Intelligence at the edge is really an extension of compute offload. It refers to operators hosting AI and machine learning microservices at the edge. These would likely be workloads that are too computationally intensive to run on the end devices (especially sensor-class devices), and the wireless edge could serve as a natural host.

Why is it a Killer Service?

With all the hype around AI, it is easy to miss the fact that we are just at the initial stages of discovering its true impact. By most estimates, AI will become increasingly commonplace over the next decade. The proliferation of microservices and the rise of serverless computing make it practical to host AI-related services in an edge environment where they can be invoked securely, execute their task, and then release compute resources when complete, all in a seamless fashion. AI at the edge could spawn an entire ecosystem of third-party microservices, built by companies that provide key enabling services rather than complete end-user applications. A rich ecosystem of services would likely beget a marketplace focused on offering AI capabilities, similar to those of Microsoft and Algorithmia. Developers would have access to these services, verified to work with edge infrastructure and available on a pay-as-needed basis, all of which further reduces the barrier to developing the next generation of pervasive and immersive applications for man or machine.
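
As a purely illustrative sketch of the “invoke, execute, release” pattern, here is a minimal serverless-style inference handler in Python; the event shape, the stand-in model, and the anomaly-detection task are all assumptions, not a reference to any particular edge platform.

```python
import json

_model = None  # loaded lazily and held only while the instance is warm

def load_model():
    """Stand-in for loading a real model (e.g. an ONNX or TensorFlow graph)."""
    return lambda features: sum(features) > 1.0  # toy anomaly check

def handler(event, context=None):
    """Invoked per request; the platform reclaims resources when it scales to zero."""
    global _model
    if _model is None:
        _model = load_model()
    features = json.loads(event["body"])["features"]
    result = {"anomaly": bool(_model(features))}
    return {"statusCode": 200, "body": json.dumps(result)}

# Example invocation, as a hosting platform might perform it:
print(handler({"body": json.dumps({"features": [0.4, 0.9]})}))
```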

Summary

The next time you want to think about what to do with the infrastructure edge, consider these four killer services. Based upon where you are in your edge strategy and deployment, they could justify a business investment and help accelerate the large-scale rollout of edge computing.

Joseph Noronha is a Director at Detecon Inc., Deutsche Telekom’s management consulting division, where he leads its practice in Emerging Technologies and Infrastructure. He has extensive “on the ground” experience with infrastructure players around the world, spanning the Americas, Europe, the Middle East, Africa, and Asia. His interests lie in next-generation connectivity, IoT, XaaS, and, more recently, edge computing, from product conceptualization and ecosystem building to driving and managing the commercial deployment of these services.

Opinions expressed in this article do not necessarily reflect the opinions of any person or entity other than the author.