
Uniting Behind a Cloud-Native Edge

By Jason Shepherd | October 2, 2018 | Blog

As the number of internet-connected devices reaches into the billions, we need cloud-native models that can facilitate edge and IoT applications. Jason Shepherd of the open source EdgeX Foundry explores this concept.

Editor’s Note: This is a guest post from an industry expert. The State of the Edge blog welcomes diverse opinions from industry practitioners, analysts, and researchers, highlighting thought leadership in all areas of edge computing and adjacent technologies. If you’d like to propose an article, please see our Submission Guidelines.

A few years back, the number of devices connected to the internet surpassed the number of people. In fact, it’s estimated that in 2019, there will be more connected “things” online than traditional end-user devices. Estimates vary widely on the total number of connected things over time, but no matter how you count, it’s going to be a lot. All of these things represent new actors on the internet and a huge catalyst for digital transformation.

The proliferation of connected things creates an inexorable shift to distributed computing, particularly at the edge of the network—often referred to as edge computing. For edge computing to be as robust as the cloud, we need a cloud-native ecosystem for building and deploying applications.

In this article, I will try to provide historical and technical context for edge computing, while also clearing up some of the confusing edge lingo that’s emerged. I’ll also touch upon how we must extend cloud-native practices to the edge, highlighting the importance of the EdgeX Foundry and Akraino projects within The Linux Foundation for facilitating an interoperable edge computing ecosystem.

The Pendulum Shifts to the Edge

In the history of computing, the pendulum has faithfully swung every 10 to 15 years between centralized and distributed models. Given the sheer volume of networked devices going forward, it’s inevitable that we need distributed architectures because it’s simply not feasible to send all the collected data directly to the cloud.

There are three key technical requirements driving demand for edge computing:

  • Latency: No matter how fast and reliable your network is, you simply don’t want something like a car airbag triggered from a cloud data center thousands of miles away.
  • Bandwidth: There’s an inherent cost to moving data, which is especially painful over cellular and even worse via satellite. Processing data close to where it’s produced means you only send what matters upstream (a minimal sketch follows this list).
  • Security: Many legacy systems were never designed to be connected to broader networks, let alone the internet. Edge computing nodes close to field devices can provide functions such as root of trust, identity, encryption, segmentation and threat analytics for these systems, as well as for highly constrained devices that don’t have the horsepower to protect themselves. It’s important to be as close to the data source as possible so any issues are remedied before they proliferate and wreak havoc on broader networks.
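To make the bandwidth point concrete, here’s a minimal sketch in Go of edge-side aggregation. The sensor, the ten-second window and the cloud ingestion endpoint are all hypothetical; the idea is simply that the edge node summarizes raw readings locally and ships only a compact summary upstream instead of every raw message.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"math/rand"
	"net/http"
	"time"
)

// summary is the compact record the edge node sends upstream
// instead of every raw reading.
type summary struct {
	Sensor string    `json:"sensor"`
	Count  int       `json:"count"`
	Min    float64   `json:"min"`
	Max    float64   `json:"max"`
	Mean   float64   `json:"mean"`
	Window time.Time `json:"window_end"`
}

func main() {
	const window = 10 * time.Second // hypothetical aggregation window
	readings := make(chan float64)

	// Simulate a chatty local sensor producing a reading every 100 ms.
	go func() {
		for {
			readings <- 20 + rand.Float64()*5 // e.g. a temperature in °C
			time.Sleep(100 * time.Millisecond)
		}
	}()

	ticker := time.NewTicker(window)
	var count int
	var sum, min, max float64

	for {
		select {
		case r := <-readings:
			if count == 0 || r < min {
				min = r
			}
			if count == 0 || r > max {
				max = r
			}
			sum += r
			count++
		case <-ticker.C:
			if count == 0 {
				continue
			}
			s := summary{"temp-01", count, min, max, sum / float64(count), time.Now()}
			body, _ := json.Marshal(s)
			// Hypothetical cloud ingestion endpoint: one small POST per window
			// instead of hundreds of raw messages.
			resp, err := http.Post("https://cloud.example.com/ingest", "application/json", bytes.NewReader(body))
			if err == nil {
				resp.Body.Close()
			}
			fmt.Printf("sent summary of %d readings\n", count)
			count, sum = 0, 0
		}
	}
}
```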

Beyond these technical reasons, the kicker that drives the need for ever more edge computing is the total lifecycle cost of data. People who start with heavily cloud-centric solutions often quickly realize that chatty IoT devices hitting public cloud APIs can get super expensive. And on top of that, you then have to pay to get your own data back!

Exploring the Many Edges

So what is the “edge”? The fact is, there isn’t a single one.

  • To a telco, the edge is at the bottom of their cell towers or at their baseband units. This is the closest location to subscribers that they fully control, so moving content and services there minimizes latency for the best end-user experience and reduces overall bandwidth consumption throughout their core networks.
  • To an ISP or Content Delivery Network (CDN), the edge is their IT equipment in data centers at key internet hubs, which they might call the “cloud edge”, and for the same reasons.
  • Other edges include on-prem data centers, both the traditional kind and the ever-increasing number of micro-modular data centers that bring more server-class compute closer to the producers and consumers of data.
  • Then come localized systems, including hyper-converged infrastructure and edge gateways sitting immediately upstream of sensors and control systems. A key differentiator for all of these on-prem edges is that they’re on the same LAN/PAN as the field devices themselves, which brings security and uptime benefits for mission-critical applications.
  • And to an OT (Operational Technology) professional, the edge means the controllers and field devices (e.g. sensors and actuators) that gather data from the physical world and run their processes.

In effect, the location of edge computing is based on context, but all edge computing initiatives share the same goal of moving compute as close as both necessary and feasible to the users and devices needing it.

Fog vs. Cloud

The term “fog” is … foggy to a lot of people. Simply put, fog computing refers to the combination of all the edges and the networks in between: effectively, everything from device to cloud. Fog and cloud are not incompatible; in fact, they will work together.

The bottom line is that, regardless of how we label things, we need scalable ways for distributed computing resources to work together with public, private and hybrid clouds while meeting the needs of OT and IT organizations in areas such as data ingestion, analytics, security and manageability.

IoT Needs a Cloud-Native Edge to Scale

Key to the concept of cloud-native is the use of modern DevOps, continuous delivery, loosely coupled microservices and overall platform independence. It’s important to understand that the term cloud-native is more about how software is built and deployed than where it’s actually run.
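To make “how software is built and deployed” a bit more tangible, here’s a minimal sketch of a loosely coupled microservice in Go. The service name, port and endpoint are purely illustrative and not tied to any particular framework; the point is that the same small, self-contained service can be containerized once and run unchanged in a cloud cluster or on an edge gateway, with configuration supplied by the environment.

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
	"os"
)

func main() {
	// Configuration comes from the environment, so the same container image
	// can run unchanged in a cloud cluster or on an edge gateway.
	port := os.Getenv("SERVICE_PORT")
	if port == "" {
		port = "8080" // illustrative default
	}

	// A liveness endpoint that an orchestrator (Kubernetes, or a lighter
	// edge scheduler) can probe, wherever the service happens to run.
	http.HandleFunc("/ping", func(w http.ResponseWriter, r *http.Request) {
		json.NewEncoder(w).Encode(map[string]string{"status": "ok"})
	})

	log.Printf("edge microservice listening on :%s", port)
	log.Fatal(http.ListenAndServe(":"+port, nil))
}
```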

It’s only logical that the same principles that help companies develop and deploy massively scalable applications in the cloud are also highly applicable across all the different edges. In fact, I would contend that these principles are necessary for advanced-class IoT.

EdgeX Foundry: Facilitating an Interoperable Cloud-Native Edge Ecosystem

Launched last year by The Linux Foundation and already backed by nearly 70 member organizations spanning 16 countries, the vendor-neutral EdgeX Foundry open source project aims to build an open framework for edge computing that facilitates an interoperable cloud-native edge ecosystem. It’s not a standard; rather, it aims to be a de facto standard framework that brings together any mix of existing connectivity protocols with an ecosystem of value-added applications.

The EdgeX project is focused on doing just enough to drive industry alignment through common APIs governed by the project’s Technical Steering Committee without encroaching on where the real IoT money is – infrastructure, applications and services.
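As a rough illustration of what consuming those common APIs looks like, here’s a minimal Go sketch that polls an EdgeX core data service over REST for recent readings. The port and path shown are assumptions based on the v1-era defaults and may differ by release, so check the project documentation for your version.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Assumed local EdgeX core data endpoint (v1-era default port and path);
	// adjust for your release of the framework.
	url := "http://localhost:48080/api/v1/reading"

	resp, err := http.Get(url)
	if err != nil {
		fmt.Println("core data not reachable:", err)
		return
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		fmt.Println("read error:", err)
		return
	}
	// Readings come back as JSON, regardless of which device service
	// (Modbus, BACnet, MQTT, etc.) originally produced them.
	fmt.Println(string(body))
}
```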

Nobody wins if you’re the hundredth person this week to write an application-level driver for the same device using the same “standard,” or to reinvent foundational tools for security and management that end users can trust. As I heard it put at a conference last week, this type of work is “undifferentiated heavy lifting.”

You can read about the EdgeX community’s accomplishments in the first nine months after the April 2017 project launch, as well as key tenets and priorities for this year, in my post here. Eric Brown of Linux.com also did a great writeup on the recent “California” code release and what’s in store for the project’s “Delhi” release in October, when we’re also going bigger at IoT Solutions World Congress with the launch of developer kits and more Vertical Solution Working Groups in areas such as Buildings and Transportation.

For more info on the project, or to learn how to get involved (including in these domain-specific working groups), visit www.edgexfoundry.org or email info@edgexfoundry.org. We invite you to join the growing community as a project member, contributor or end user. Or better yet, all of the above!

Collaboration with the Akraino project

Arpit Joshipura, the GM of Networking and Orchestration at The Linux Foundation, often talks about the mission and scope for the Akraino project. I had the pleasure of attending the Akraino Summit and the energy in the room was fantastic. We talked about how the EdgeX Foundry and Akraino projects are highly complementary and how we can collaborate to ensure that each effort is valuable independently but curated to work great together as a full open source stack with interoperability APIs that address OT and IT needs across the many edges.

In particular, we identified the opportunity to increase context awareness between the EdgeX application-level APIs (which provide secure, manageable interoperability between devices and applications) and the Akraino APIs (which foster interoperability between underlying distributed edge infrastructure functions such as workload orchestration and networking). The possibilities here are huge: underlying infrastructure that can dynamically optimize itself based on context to serve the needs of any collection of interoperable EdgeX-compliant microservices (using the key EdgeX APIs, regardless of how proprietary the overall code is).

We also spoke about the potential to coordinate efforts between the Vertical Solution Working Groups in EdgeX and the domain-specific blueprints offered by Akraino. Further, we discussed bridging to other key edge efforts including testbed activity that’s spinning up with EdgeX in the Industrial Internet Consortium as well as broad, unifying resources such as the Open Glossary of Edge Computing.

Winning together

Net-net, we all win if we work together to drive an open and massively scalable cloud-native edge ecosystem. Plus, by linking important open source efforts like EdgeX and Akraino with distributed ledger technologies over time, we can achieve what I believe is the holy grail of digital: monetizing data, resource sharing and services through people you don’t even know!

Jason Shepherd is Chair of the EdgeX Foundry Governing Board and CTO for IoT and Edge Computing at Dell Technologies. To read more about the concepts in this article and beyond, check out his five-part TechTarget blog series running through mid-October. Follow Jason and EdgeX Foundry on Twitter at @defshepherd and @EdgeXFoundry, respectively.

Opinions expressed in this article do not necessarily reflect the opinions of any person or entity other than the author.