Cloud native thrives in the COVID-19 era

Never would I have imagined that a sub-microscopic infectious agent would be the greatest catalyst for digital transformation. And in this era of COVID-19, a key enabler of accelerated digital transformation is cloud computing, not just as an operating model but also as a technological shift. In particular, there is increased focus among organisations large and small on the cloud native approach to software development. The use of microservices, containers and DevOps is fuelling the adoption of hybrid cloud architectures, the infusion of AI into enterprises, and the exploitation of the full potential of 5G and edge computing.

To appreciate the true value of the cloud native approach, it is important to step back and reflect on how it differs from previous approaches to building applications. Since starting as a developer in the early 1990s, I do not remember a time when we were not trying to build enterprise applications based on projections of future customer needs and business models. Unfortunately, this typically resulted in sub-optimal use of infrastructure sized for peak workloads, loss of agility due to dependence on underlying proprietary platforms, and rigid monolithic enterprise applications unable to keep pace with unexpected changes in business direction or customer patterns.

But now, with the cloud native approach, you are inherently building applications that respond better to change and uncertainty. Cloud native applications adapt and evolve, with new features and functionality released incrementally but more quickly, reliably and frequently, with less risk. Experimenting with ideas, products or services alongside customers, in a world that is increasingly unpredictable, is also economically feasible and fast. And for enterprises to remain competitive, it is important to continually innovate and add features to established products and services. The extensible and robust nature of cloud native architectures enables this at lower cost without compromising operational efficiency or security rigour.

Cloud native also fundamentally boosts DevOps by enabling further automation of existing decision points between development teams and IT operations. By transforming decisions about provisioning, scaling and zero-downtime deployment into automated tasks, we move closer to applications and systems that respond better to market dynamics and volatility. At the same time, cloud native helps organisations move away from costly always-on infrastructure through elastic computing, metered billing and pay-per-use models. The risk of massive system failures is also mitigated with cloud native applications built on a loosely coupled microservices architecture.
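As a back-of-the-envelope sketch of the pay-per-use point above (the hourly rate, instance counts and peak window below are illustrative assumptions, not real cloud pricing), compare always-on infrastructure sized for peak demand with elastic scaling that pays only for capacity actually used:

```python
# Toy cost comparison with hypothetical numbers: always-on,
# peak-sized infrastructure vs. pay-per-use elastic scaling.

HOURLY_RATE = 0.10          # assumed price per instance-hour
PEAK_INSTANCES = 100        # capacity needed at the daily peak
BASELINE_INSTANCES = 10     # capacity needed off-peak
PEAK_HOURS_PER_DAY = 2      # assumed length of the daily peak

def always_on_daily_cost():
    # Sized for peak demand and running 24 hours a day,
    # regardless of actual load.
    return PEAK_INSTANCES * 24 * HOURLY_RATE

def elastic_daily_cost():
    # Pay only for what is used: peak capacity during peak
    # hours, baseline capacity for the rest of the day.
    peak = PEAK_INSTANCES * PEAK_HOURS_PER_DAY * HOURLY_RATE
    off_peak = BASELINE_INSTANCES * (24 - PEAK_HOURS_PER_DAY) * HOURLY_RATE
    return peak + off_peak

print(always_on_daily_cost())  # 240.0
print(elastic_daily_cost())    # 42.0
```

With these assumed figures, elastic scaling costs under a fifth of the always-on approach for the same peak capability; the real saving depends entirely on how spiky the workload is.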

With enterprise Kubernetes platforms like Red Hat OpenShift, cloud native is no longer confined to the public cloud. Distributed cloud native applications can be deployed across hybrid cloud architectures that span public clouds, private clouds and on-premises data centres. When a regulatory policy changes, a security posture shifts, or a cloud service provider alters its pricing, you now have the flexibility to move components of an enterprise application freely to any infrastructure, on or off premises. However, it is critical to evaluate whether a Kubernetes distribution is truly open source: some enterprise distributions include open components but ultimately lock you into proprietary underpinnings.
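This portability rests on Kubernetes's declarative model: a minimal sketch (the service name and image below are hypothetical) is a standard Deployment manifest that can be applied unchanged to any conformant cluster, whether it runs in a public cloud, a private cloud or an on-premises data centre:

```yaml
# Hypothetical Deployment manifest; the same file can be applied
# to any conformant Kubernetes cluster, on or off premises.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders-service              # hypothetical microservice
spec:
  replicas: 3                       # the scheduler handles placement
  selector:
    matchLabels:
      app: orders-service
  template:
    metadata:
      labels:
        app: orders-service
    spec:
      containers:
      - name: orders-service
        image: registry.example.com/orders-service:1.0  # hypothetical image
        ports:
        - containerPort: 8080
```

Because the manifest describes desired state rather than infrastructure specifics, moving the workload between environments is a matter of pointing the same definition at a different cluster rather than rewriting the application.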

In the current pandemic, AI is helping organisations with everything from tackling the increased security risks of a remote workforce to reducing the overwhelming call volumes facing healthcare agencies. Here, cloud native's dynamic architecture gives AI applications access to data regardless of location, processing power matched to distinct computational needs, and appropriate analytics or machine learning capabilities. The scalable nature of cloud native is also well suited to managing variability and fluctuations in data streams, while a composite AI application consisting of components with different dataset requirements can be deployed as distributed, self-contained microservices.

Recent global network traffic patterns indicate an increased shift towards decentralised computing, triggered by social distancing. This is driven not only by remote work and learning, but also by the accelerated automation of processes in industries forced to rely on a reduced workforce or intentionally minimising human interaction. Agile DevOps, lightweight and portable containers on an enterprise Kubernetes platform, and loosely coupled microservices are well suited to addressing edge computing challenges around processing power, connectivity, bandwidth and latency. With cloud native, you can deploy AI applications that make real-time decisions at the edge while also performing big data analysis in the data centre.

These are extraordinary times, and it is important that we consciously and purposefully rethink architectural and technology choices, not just to survive this crisis but to come out better and stronger. The private and public sectors alike are already experiencing first-hand the resiliency and agility that the cloud native and hybrid cloud approach offers. This is one of those rare, pivotal technological shifts that will enable organisations to realise unmatched capabilities for growth and value creation for years to come.

The author is Shanker V Selvadurai, Vice President & CTO of IBM Cloud & Cognitive Software, Asia Pacific
