There is a natural tendency to think of the spectrum of platforms that developers employ to build and deploy applications in isolation. In reality, those platforms are all interconnected. An application designed primarily to run on an edge computing platform will still need to invoke a wide range of backend services that may reside in the cloud or in an on-premises IT environment.
To provide that capability, Wind River has updated the Wind River Studio development environment to give developers a way to programmatically invoke platforms at a higher level of abstraction, irrespective of where they are physically located, says Wind River CTO Paul Miller. "This is a big pivot," he says. "It's a modern framework."
For example, Wind River has added a visual tool to define, monitor, and execute complex software pipelines for applications deployed on its VxWorks and Wind River Linux platforms. Those pipelines can then be integrated with application orchestration tools that enable IT teams to deploy, configure, and manage the lifecycle of complex applications within the context of a DevOps workflow spanning both its platforms and public clouds such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP).
In addition, Wind River is making available tools to create digital twins, along with support for a 5G vRAN accelerator and GPUs enabled for 5G and augmented reality applications.
Finally, Wind River provides a digital feedback loop based on real-time analytics that spans all the operating systems, applications, infrastructure, systems, and devices involved.
As applications are increasingly deployed across what is becoming an extended distributed computing environment, the line between various classes of platforms is starting to blur. It's not really feasible, for example, to deploy an application at the edge that does not depend on a wide range of services running on other edge computing platforms, in a local data center, or in a cloud computing environment. From the perspective of an application developer, at least, it's all just one logical set of infrastructure for deploying a distributed application made up of multiple interdependent microservices.
As that increasingly becomes the case, it will become apparent just how artificial the delineations made between various classes of platforms really are. The phrase "cloud computing" may not disappear entirely as a result, but it's clear the underlying platforms that distributed applications are deployed on are becoming less relevant. As Kubernetes clusters become more widely employed, a common application programming interface (API) is emerging across a wide range of IT infrastructure types.
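To make the common-API point concrete, the sketch below builds a minimal Kubernetes Deployment manifest in Python using only the standard library. The point is that the identical manifest can be applied, unchanged, to a managed cluster on AWS, Azure, or GCP, or to an edge cluster, because they all expose the same Kubernetes API. The service name and image here are purely illustrative, not drawn from any Wind River product.

```python
import json

def deployment_manifest(name: str, image: str, replicas: int = 1) -> dict:
    """Build a minimal Kubernetes Deployment manifest as a plain dict.

    The same manifest works on any conformant cluster -- EKS on AWS,
    AKS on Azure, GKE on GCP, or an edge cluster -- which is the
    "common API" emerging across infrastructure types.
    """
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [{"name": name, "image": image}],
                },
            },
        },
    }

if __name__ == "__main__":
    # Hypothetical service; the JSON output could be piped to
    # `kubectl apply -f -` against any cluster, regardless of
    # which cloud (or edge site) hosts it.
    manifest = deployment_manifest("edge-svc", "example/edge-svc:1.0", replicas=3)
    print(json.dumps(manifest, indent=2))
```

Because the manifest targets the Kubernetes API rather than any one provider's API, the choice of underlying platform becomes a deployment detail rather than a design constraint.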
In the meantime, developers should expect other frameworks they employ to evolve similarly. Of course, it will be up to developers to decide what to call this brave new world of IT. Still, building and deploying distributed computing applications should get simpler in the months ahead as the need to push more application code to the edge becomes more apparent.