
The Inevitable Obviousness of the Wireless Edge Cloud

By Peter Christy | August 14, 2018 (updated October 5, 2020) | Blog

Peter Christy, a former 451 Research analyst, asks the question: Is the wireless edge cloud a bold new wave of computing, or just the obvious next step?

Note: This is a guest post from an industry expert. The State of the Edge blog welcomes diverse opinions from industry practitioners, analysts, and researchers, highlighting thought leadership in all areas of edge computing and adjacent technologies. If you’d like to propose an article, please see our Submission Guidelines.

Thirty-five years ago, if a science-fiction writer had extrapolated the future from an IBM PC connected to a timesharing system by a 1200 baud modem, he or she might have envisioned today's wireless Internet. Armed with a rudimentary understanding of Moore's "Law" (2X improvements every two years), it wouldn't have been that far-fetched to envision a powerful, wireless, battery-powered handheld computer intimately attached to a rich set of remote (cloud) resources. It's even less astounding that we live in that world today, considering technology has improved by a factor of roughly 2^17 (more than 100,000 times) in that period. For an imaginative free spirit, today's smartphone would have been pretty obvious. It's only complex when you know the history.
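For the curious, that factor is just simple compounding. The short sketch below is illustrative only, using the 35-year span and two-year doubling period from the paragraph above:

```python
# Back-of-the-envelope Moore's Law arithmetic:
# one 2x improvement every two years, compounded over 35 years.
years = 35
doubling_period_years = 2

doublings = years / doubling_period_years   # 17.5 doublings
improvement = 2 ** doublings                # roughly 185,000x

print(f"{doublings:.1f} doublings -> roughly {improvement:,.0f}x improvement")
```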

A Convoluted Path to the Internet

The precursor to the modern Internet was born in 1969 as the ARPANET, a Department of Defense computer science research project that connected three research centers in California and one in Utah via 56Kb "backbone" links. The engineers who designed that network were solving a military problem, building a communications network that could survive a nuclear attack, but they ended up creating the understructure of today's Internet, though it would take another 20 years to emerge. Fiber optic communication took a decade to arrive. The IBM PC, the ancestor of the smartphone, didn't show up until 1981, and the Mac not until 1984. The World Wide Web didn't come until 1990, over two decades after those four research centers were connected. The iPhone, the personal computer we always wanted, wasn't announced until nearly 40 years after the ARPANET.

And then came the wireless Internet, for which the iPhone was the turning point. Demand created by iPhone users drove the buildout of the 4G/LTE network, and that happened only in the last decade, some 45 years after the birth of the Internet. This is the convoluted and time-consuming history that gave us today's ubiquitous, high-bandwidth, cost-effective wireless Internet.

The path to today’s internet might be labyrinthine, but wasn’t it obvious this is what we wanted all along?

The Emergence of Cloud Computing

Timesharing systems, multi-user systems that provided computing services without each user having to buy and operate a computer, first showed up in 1964 (the Dartmouth Time Sharing System), five years before the Internet. These timesharing computers occupied entire rooms. They were expensive, cumbersome, and few and far between, so many users shared them from remote locations, connecting with "dumb" terminals over voice telephone lines and modems.

As computers got more powerful and cheaper, we momentarily stopped sharing them as we each got our own "personal" computer that sat on our desk. Many of the early PCs (the Apple II, the IBM PC) weren't even connected to a network; files and data were shared on floppy disks. Even when there was a "network," it was typically there to support file sharing. Sophisticated businesses implemented centralized storage on a "server," and applications that were shared or that needed bigger systems started appearing on those servers as well.

These servers, as they were called, became increasingly difficult to operate, and even small businesses had to start hiring IT experts to maintain the simplest of systems. As computers got cheaper and cheaper, the complexity and cost of running them grew, and the desirability of using a managed computing service increased. As companies became comfortable "outsourcing" their servers to third parties, the door to the cloud was opened.

VMware introduced robust virtualization in 2001 that let disparate software workloads share the same hardware, making these new centralized servers look a lot like the old timesharing systems, only running modern applications. Virtualization became the definitive way to share common infrastructure while maintaining security between clients, which paved the way for the massive shift from on-premises servers to what we now call cloud computing.

The seminal event (the "iPhone" of cloud computing) was Amazon's unveiling of Amazon Web Services (AWS) in 2006. AWS offered virtual machines as an on-demand, pay-as-you-go service. All of a sudden, the distinction between timesharing and having your own server essentially disappeared. Anything you could do on your own server you could do on AWS without buying and operating the computer.

The Obvious Arrival of the Wireless Edge

Today, the number of mobile devices exceeds the population of the planet. With the advent of 5G mobile services and the accelerating demand for low-latency clouds, we’re seeing a next-generation wireless edge cloud emerge.

Operational automation was the final missing link required to make edge cloud computing possible. The hyperscale cloud providers all realized they had to reduce human involvement in their operations. First, the only way to run massively scaled systems with high availability is to eliminate, or at least mitigate, the possibility of human error. Second, having humans manage hundreds of thousands of servers in a data center is not only unwieldy, but slow and error-prone. The major cloud service providers all adopted what could be called "NoOps" strategies (in contrast to DevOps; Google's Site Reliability Engineering is a documented example). Edge cloud computing, composed of thousands of small data centers housing resources in unmanned locations, requires automated deployment and operation, which will evolve naturally out of the large-scale automation already developed.

The goal of edge computing is to maximize the performance of the "cloud" (on-demand, managed services) by locating key resources so they can be accessed via the Internet with minimum latency and jitter and maximum bandwidth. In other words, to provide services that come as close as possible to what you could do with a local server, without your having to buy or operate that server. As was the case with the wireless Internet, it took a lot of hard work and serial invention to get to where we are today with edge cloud computing. But as with networking, the answer is obvious: it's what you would want and expect if you didn't know how hard it was to create.

As cloud providers and companies like MobiledgeX offer managed services for placing workloads out at the edge of the wireless network, a wireless edge cloud becomes the natural outcome. The wireless edge cloud will bring all the conveniences of cloud computing to the edge of the network, enabling the next generation of wireless applications, including mobile AR/VR, autonomous vehicles, and large-scale IoT.

Obvious, right?

Peter Christy is an independent industry analyst and marketing consultant. Peter was a Research Director at 451 Research, where he earlier ran the networking service, and before that was a founder and partner at Internet Research Group. Peter was one of the first analysts to cover content delivery networks when they emerged, and he has tracked and covered network and application acceleration technologies and services since. Recently he has worked with MobiledgeX, a Deutsche Telekom-funded, Silicon Valley-based startup that is building an edge platform. His first post on the State of the Edge blog was Edge Platforms.

Opinions expressed in this article do not necessarily reflect the opinions of any person or entity other than the author.