By Emeka Nwafor
The “edge-computing” qualifier has been co-opted by different market segments to characterize use cases as varied as the equipment and compute infrastructure at the edge of a cellular network (e.g., mobile edge computing) and the on-premise monitoring and control systems that power manufacturing environments.
Wind River characterizes edge computing as a distributed, decentralized computing model in which the underlying compute infrastructure sits close to the source that is generating data, away from centralized compute in the cloud. Industrial automation controllers, autonomous vehicle controllers, remote monitoring equipment, SCADA, DCS, PLC, gateways, and on-premise servers are all examples of edge computing infrastructure. IDC forecasts that spending on building out edge infrastructure will grow at a CAGR of 22% through 2021, outpacing spending on core infrastructure by 2.5x.
The compelling attribute of edge computing is that it addresses today’s operational reality: the traditional methods of expanding capacity to meet demand have run their course. The pressures of workforce availability, workforce cost, energy utilization, shortened timelines, and increasing demand for tailored products and services are driving a rethinking of how enterprises use digital technology to radically change performance. At its core, this is what digital transformation is about.
The vast majority of today’s far edge devices (i.e., embedded monitoring and control systems) that constitute an enterprise’s operational technology environment are fixed-function, monolithic devices designed to perform a set of specific tasks repeatedly and reliably. Upgrading these systems means replacing them, a cumbersome and often costly approach that typically results in downtime. These far edge devices are evolving toward more modern architectures that embrace the continuous integration and delivery practices of modern enterprise software. This opens these systems up to be more dynamic, since they can execute hardware-independent workloads (i.e., microservices).
The progression toward automated, intelligent, and autonomous systems is shifting the focus point for where learning systems are applied. To drive productivity increases, AI/ML systems are evolving to encompass more of the edge computing infrastructure, including on-premise servers, gateways, and end devices. The productivity gains come from applying learning algorithms closer to the source of the data stream, where they can react to and control the system with low latency.
As more intelligence moves to the edge, it is important to recognize that this is not a replacement for cloud or enterprise computing. Rather, edge computing augments cloud computing by enabling workloads to be balanced between the two. Enterprises should view their edge and cloud infrastructure as a continuum that lets them place workloads where they are needed. With this approach they can realize the best of both worlds: the elastic, scalable compute power of the cloud, combined with a lower overall TCO from normalizing and processing data streams closer to their source.
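To make the cost argument concrete, consider a minimal sketch of edge-side preprocessing (the function name, normalization range, and summary fields here are hypothetical illustrations, not a Wind River API): raw sensor readings are normalized and reduced to a compact summary at the edge, so only a handful of values, rather than every sample, travel upstream to the cloud.

```python
# Illustrative sketch of edge-side data-stream normalization.
# All names and parameters are hypothetical, for illustration only.
from statistics import mean


def summarize_window(readings, lo=0.0, hi=100.0):
    """Normalize raw readings to [0, 1] against an assumed sensor
    range, then reduce the window to a compact summary suitable
    for forwarding to the cloud."""
    normalized = [(r - lo) / (hi - lo) for r in readings]
    return {
        "count": len(normalized),
        "min": min(normalized),
        "max": max(normalized),
        "mean": mean(normalized),
    }


# One window of raw samples is processed locally; only the
# four-field summary would be transmitted upstream.
window = [42.0, 50.0, 58.0, 46.0]
summary = summarize_window(window)
```

However the summary is structured in practice, the design point is the same: the edge node absorbs the high-rate raw stream, and the cloud receives a low-rate, normalized digest.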
In conclusion, enterprises will need to embrace a paradigm shift when executing on a digital transformation strategy. There are aspects of cloud computing, both development and operating, that extend to edge computing, such as those referenced above. These need to be planned for and implemented while managing the heterogeneous attributes of the edge compute infrastructure, and without sacrificing the real-time, high-availability, and safety attributes that characterize it. For nearly 40 years, Wind River has been delivering middleware, tools, and services used in over 2 billion devices, most of them powering critical infrastructure at the edge. The next chapter in our journey is working with these enterprise customers to help them reap the business benefits of digital transformation.