Postcards from the Edge

Increasing BMS IQ via Edge Computing


By Art King, Director of Enterprise Services
Courtesy of Connected Real Estate Magazine (Rich Berliner, Publisher)

We are on the cusp of an automation wave that has the potential to increase building management system (BMS) IQ by adding awareness of tenant mobile devices across a facility or even a campus. The time to plan for this future is now, because the technical foundation must be laid before it arrives. (Multi-access edge computing (MEC) and URLLC will be referred to simply as “edge computing” throughout the balance of this post.) Edge computing pilots began in 2011, and with 5G it is architected directly into the cellular network. With the ratification of the 5G standards, industry and vertical-market players are making significant investments to put edge computing to work. We’ll frame the business impact of this trend and then diagram the technology.

What Is Propelling This Automation Wave?

The assumption of 100% mobile device ownership has transformed our design thinking. In the drive to increase tenant satisfaction by reducing friction in accessing amenities in workspaces, there is an opportunity to shape a strategy around mobile devices. We see this emerging now in co-working spaces and enterprise remodels that drive for the same vibe as co-working. In these environments, an app can check you in, unlock doors, book conference rooms, and request resources. In the near future, this capability can be a service offered by the facility to all tenants or across an enterprise.

Additionally, our mobile devices now enable context-aware computing.

The Gartner IT Glossary defines context-aware computing as “a style of computing in which situational and environmental information about people, places and things is used to anticipate immediate needs and proactively offer enriched, situation-aware and usable content, functions and experiences.”

If you assume that every mobile device acts as a digital proxy for the owner, this is a driver for implementing edge computing to enhance the tenant experience in the building and potentially offer APIs to enterprise business applications so they can add a layer of context awareness.

Context-aware decisions primarily revolve around:

  • Are you a tenant or a public subscriber passing through the building?
  • Are you present in the building?

Many potential context-aware applications will use simple “presence” – meaning the mobile device is present on the local network. BMS applications will be able to consume information about whether a device is within the facility and where it is located, and then act accordingly. We foresee that BMS directories will use AI with automated provisioning to ease the task of identifying tenant mobiles. For instance, if the edge system sees the arrival and departure of the same device every day, it can be assumed that it belongs to a tenant.
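The tenant-identification heuristic described above can be sketched in a few lines. This is a minimal illustration, not a real BMS directory: the function name, the observation log format, and the five-day threshold are all assumptions made for the example.

```python
from collections import defaultdict
from datetime import date

def classify_devices(presence_log, min_days=5):
    """Label device IDs as 'tenant' or 'visitor' based on how many
    distinct days each was observed on the local network.

    presence_log: iterable of (device_id, date) observations.
    min_days: illustrative threshold; a real system would tune this.
    """
    days_seen = defaultdict(set)
    for device_id, day in presence_log:
        days_seen[device_id].add(day)
    return {
        device_id: "tenant" if len(days) >= min_days else "visitor"
        for device_id, days in days_seen.items()
    }

# A device seen seven days running looks like a tenant; a one-off
# appearance looks like a passer-by.
log = [("dev-a", date(2019, 6, d)) for d in range(3, 10)]
log += [("dev-b", date(2019, 6, 4))]
print(classify_devices(log))
```

A production system would of course weigh dwell time, time of day, and opt-in status rather than raw day counts, but the shape of the decision is the same.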

How Will We Use This Capability?

Energy management is one example where location information can be leveraged by the BMS to optimize resource consumption, reducing dependency on simple timers or thermostats to control lighting, HVAC, and other services. As the BMS takes action based on which devices appear within the building, and in what density, utility consumption can be managed while comfort increases. Device density information can be used to optimize heating and cooling based on the number of people per square foot in a zone. Imagine not having to adjust a conference room thermostat, because it automatically adjusts to maintain optimal room temperature based on detected occupancy. This simple function provides ROI by managing both comfort and utility use.
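A toy version of that occupancy-driven setpoint logic might look like the following. The density threshold, step size, and offset cap are invented numbers for illustration; a real BMS would derive them from the zone's thermal model.

```python
def hvac_setpoint(base_setpoint_c, occupants, zone_area_m2,
                  density_threshold=0.1, step_c=0.5, max_offset_c=2.0):
    """Adjust a cooling setpoint (Celsius) from detected occupancy.

    Each multiple of density_threshold (people per square meter)
    lowers the setpoint by step_c, capped at max_offset_c. An empty
    zone relaxes the setpoint upward to save energy.
    """
    if occupants == 0:
        return base_setpoint_c + max_offset_c  # relax when empty
    density = occupants / zone_area_m2
    offset = min(int(density / density_threshold) * step_c, max_offset_c)
    return base_setpoint_c - offset

print(hvac_setpoint(23.0, 0, 50))    # empty zone: 25.0
print(hvac_setpoint(23.0, 10, 50))   # 0.2 people/m^2: 22.0
```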

For public safety protocol in large commercial and government buildings, location information can enhance process capabilities in emergencies. There are a number of useful capabilities that would be valuable to tenants and first responders.

Some brief examples:

  • In an emergency, it would be possible to text every device that is present and, by knowing their locations, give them an ideal route out of the facility.
  • For emergency calls, knowing the location of the caller can get first responders to them faster. And, as health wearables improve, if a person had an event that resulted in unconsciousness, their mobile device could contact emergency personnel on their behalf.
  • In a fire or other emergency, the presence and location of devices could enhance rescue efforts. Additionally, incident commanders could reach out to people in the building to increase their situational awareness and advise them.

Finally, there is a vast number of IoT services on the horizon that will also leverage context and presence to improve their ability to serve us.

What Does Edge Computing Look Like in the Future?

The above diagram shows how edge computing will provide the local breakout necessary to both route data locally and provide a presence event stream for BMS and enterprise tenant systems to subscribe to. This illustrates one possible future for system architects and developers to think about.
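The publish/subscribe relationship described above, in which the edge platform emits presence events that both the BMS and enterprise tenant systems consume, can be sketched as a toy event bus. The class and event fields here are hypothetical; a real deployment would use a message broker and an operator- or MEC-exposed API rather than in-process callbacks.

```python
class PresenceEventBus:
    """Toy publish/subscribe hub for presence events, showing how a
    BMS and enterprise apps might share one local event stream."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        """Register a consumer; each receives every published event."""
        self._subscribers.append(callback)

    def publish(self, event):
        """Fan a presence event out to all subscribers."""
        for callback in self._subscribers:
            callback(event)

bus = PresenceEventBus()
seen_by_bms = []
bus.subscribe(seen_by_bms.append)   # the BMS consumer
bus.subscribe(lambda e: None)       # e.g. an enterprise tenant app
bus.publish({"device": "dev-a", "zone": "lobby", "type": "arrive"})
print(seen_by_bms)
```

The point of the pattern is decoupling: the edge platform does not need to know which building systems exist, only that they have subscribed.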

Many of the functions will end up as cloud services and be shared across many buildings so that the services can be easily affordable and potentially monetized.

What Can We Do Today to Prepare?

Build-Out Cellular Infrastructure

Indoor coverage remains uneven: some buildings have seen service improvements, while many have none. Pressure from both current and prospective tenants continues to mount for robust indoor LTE coverage in poorly covered buildings.

But not to worry: indoor cellular improvements for LTE are an investment in the future because they will carry over to 5G. Why? 5G in the United States will be rolled out in a mode called 5G-NSA, where NSA means “non-standalone.” In other words, 5G-NSA under the hood is LTE combined with 5G radio technology. An investment in LTE today is protected, as it can be upgraded to 5G through the addition of 5G radios and edge computing in the future.

BMS Infrastructure

Major property owners in commercial real estate (CRE), enterprise, government, and higher education spaces should start the exploration process with their major providers of control systems, access management, elevators, utilities, security, etc. to understand their future plans. Developing a multi-year roadmap and contingency plans is necessary to reach the desired destination. Large property owners of all types may have access to strategic planners at the mobile operators and key infrastructure software providers to help develop their roadmaps and clarify the business cases. This will be a journey for the industry, but we are seeing the emergence of property technology (proptech) firms that are targeting this market and could act as the “glue” amongst the disparate systems and edge compute platforms that all have to act in concert to realize the vision.

What the Cutting Edge Looks Like Today

By Jim Davis

Principal Analyst, Edge Research Group

 

Since last year’s State of the Edge Report I’ve presented at conferences in Mexico City, Toronto, Austin, Chicago, Dallas, Las Vegas, Richmond, San Francisco, and San Jose. I’ve hosted over one hundred phone calls with people from the US, Japan, UK, Spain, and Brazil, and I’ve engaged in conversations with folks worldwide in other media. What I’ve learned: edge computing is happening, and it is enabling the transformation of companies.

Edge computing is appearing in many forms, and not all of them are what might be strictly termed edge computing deployments by the definitions in the Open Glossary of Edge Computing. But when viewed on a continuum, there is definitely a movement towards a new generation of edge computing.

Example: A company that manufactures equipment for oil and gas extraction has created a platform built on industrial-strength Linux and a standard x86 4-core processor to gather data for trucks used for fracking extraction of natural gas and petroleum. Customers are operating a $20 million fleet of vehicles under strenuous environmental conditions—keeping them running costs another $9 million a year. Using an in-house edge computing platform to gather 88,000 readings per second from each truck, this company can remotely monitor the condition of their equipment and automatically create a maintenance alert that simultaneously triggers a product order at the factory.
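The monitor-alert-order workflow in that example can be sketched as a simple stream scan. Everything here is illustrative: the reading format, the threshold, and the callback names are assumptions, standing in for the manufacturer's actual (unnamed) platform.

```python
def monitor_readings(readings, limit, alert_fn, order_fn):
    """Scan a stream of (truck_id, value) sensor readings. When a
    value exceeds `limit`, raise a maintenance alert and place a
    parts order in the same step, mirroring the workflow described
    above. All names are hypothetical."""
    for truck_id, value in readings:
        if value > limit:
            alert_fn(truck_id, value)   # notify maintenance crew
            order_fn(truck_id)          # trigger factory parts order

alerts, orders = [], []
stream = [("truck-7", 0.4), ("truck-7", 0.9), ("truck-2", 0.3)]
monitor_readings(stream, limit=0.8,
                 alert_fn=lambda t, v: alerts.append((t, v)),
                 order_fn=orders.append)
print(alerts, orders)
```

At 88,000 readings per second per truck, the real system would aggregate or filter at the edge before any alerting logic runs; the value of local compute is precisely that this scan never has to leave the vehicle.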

Many industries, especially those engaged in IIoT (industrial IoT) initiatives, don’t view themselves as using edge computing. Strictly speaking, they are often correct. There are also instances where equipment in a factory is instrumented, but the data is siloed (sometimes formatted in old or proprietary communications protocols). Initial steps in an edge computing strategy might involve extracting data and translating it into a standard format using standard server technology that’s deployed in a factory. Indeed, there are examples of companies looking to retrofit machine learning-assisted vision to “read” analog gauges on machinery and stream data back for monitoring and analytics. That wasn’t possible before.

 

Edge, AI are intertwined with digital transformation

Whether retrofitted or standard, companies are finding that it isn’t easy to find actionable insights from new data streams, which leads to the observation that edge computing, AI and ML technologies and digital transformation efforts are all intertwined. This has been particularly apparent when discussing IIoT case studies in industries such as manufacturing.

Opening up new data streams can lead to many different levels of change, including optimization or wholesale reinvention of a manufacturing process and creation of new business models.

Example: Returning to the previous example of the manufacturer of equipment for oil and gas companies: drill pipes extending down from offshore drilling platforms to the seabed are critical components, with some 2.5 million pounds of load on the pipe-and-collar weld (where pipes are secured together), making maintenance essential. In the past, rig owners would remove the pipe and perform maintenance at scheduled intervals. The problem is that, because of imperfections in the welding process (among other issues), the maintenance process itself often increases the chances of equipment failure.

By enabling monitoring of the conditions under which the equipment is actually being used, and applying ML models based on factory testing of materials and experience with previous equipment failures, the manufacturer can predict with a high degree of accuracy when the pipe needs repairs. The customers have been able to keep equipment in production for upwards of twice as long before conducting repairs. In short, they are generating more revenue in between maintenance cycles, and the manufacturer is now selling a new predictive maintenance service to the customer.  
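The shift from interval-based to condition-based maintenance boils down to a different decision rule. The sketch below is schematic only: the probability values and the 0.7 threshold are invented, and the ML model that would produce such probabilities is out of scope here.

```python
def schedule_repair(failure_probability, threshold=0.7):
    """Condition-based rule: pull equipment for repair only when the
    model-predicted failure probability crosses a threshold, rather
    than at a fixed calendar interval. Threshold is illustrative."""
    return failure_probability >= threshold

# Only the high-risk pipe comes out of production; the others keep
# earning revenue until their predicted risk rises.
pipes = {"pipe-1": 0.12, "pipe-2": 0.85, "pipe-3": 0.40}
to_repair = [p for p, prob in pipes.items() if schedule_repair(prob)]
print(to_repair)  # ['pipe-2']
```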

Example: a cement manufacturing company looks to apply AI to its manufacturing process. Cement seems simple on the outside, but that belies the complex chemistry required to produce the consistent quality needed for building bridges and high-rise buildings. The company has plenty of sensor data from existing systems, as well as large data sets for training AI algorithms. 

After months of training (with hands-on assistance from a technology specialist), the company sees energy savings of 2-5% and a yield increase of 2% to 7%, along with reduced maintenance and system downtime. In a multi-billion dollar business, these results are going to impact the bottom line as the company rolls out the technology to all its plants around the world. Looking ahead, the company will gradually move towards leveraging AI to continuously manage all elements of the production process—something done manually in 10-minute intervals before—and enable autopilot mode (i.e., without human supervision) of the production system. 

 

Impact: Power and Cooling for the Edge (and Core) Cloud

Applying edge computing and AI to industry offers huge potential. Simply looking at the physical infrastructure, it’s clear that having data centers closer to data sources can help solve the networking problem. But this also raises the issue of powering and cooling the systems that are driving the business insights. Whether processing data in a core cloud or edge cloud, enterprises need to take into account the cost of power. Make no mistake: AI, ML, and other data processing workloads are power-hungry.

Chips are already drawing in the range of 200-plus watts, and next-generation Intel Xeon chips are estimated to draw as much as 330 watts. How many chips are needed for AI workloads? Some supercomputer applications use tens of thousands of processors. Even dividing that workload among tens or hundreds of facilities still translates into significant power and cooling requirements.
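A back-of-envelope calculation makes the scale concrete. The 330-watt figure comes from the estimate above; the fleet size, site count, and 1.5x overhead multiplier (a PUE-like factor for cooling and power conversion) are illustrative assumptions, not measurements.

```python
def site_power_kw(chips_total, sites, watts_per_chip=330, overhead=1.5):
    """Rough power draw per edge site, in kW: divide a chip fleet
    across sites and apply a cooling/overhead multiplier. All
    default values are illustrative assumptions."""
    chips_per_site = chips_total / sites
    return chips_per_site * watts_per_chip * overhead / 1000.0

# 20,000 chips spread over 100 edge sites still means ~99 kW each --
# well beyond what a typical telecom closet or cell site can supply.
print(site_power_kw(20_000, 100))
```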

Even some well-developed markets, such as regions of Europe and Asia, sometimes lack adequate grid power for data centers. On-site power generation and other considerations will factor into sizing an edge data center—and into determining whether it’s economically feasible at all.

All told, the impact of edge computing, IIoT data, and the use of AI/ML will focus more attention on the development of “right-sized” data centers. What size will they be? How many will be deployed in a given metro area? The challenge vendors in edge computing will face over the next 12-18 months will be to help customers build a new equation that balances location, energy, and connectivity for facilities that will accommodate the changing demands of enterprise workloads.


Based in Fresno, California, Edge Research Group provides market research, strategic advisory, and content marketing services for technology firms.