While organizations have been moving to modernize their IT infrastructures by adopting cloud-native Kubernetes technologies, the pandemic helped accelerate this movement. The surge in cloud services adoption during the lockdowns emphasized the need for organizations to make their business and technology models more agile.
As our guest speaker, Forrester Research Principal Analyst Lee Sustar, explains in a recent webinar entitled “Three Trends Driving Cloud-Native Adoption,” cloud-native technology is in a state of rapid and continual change, and organizations must adopt a cloud-native Kubernetes infrastructure built to keep evolving with it.
According to Forrester Research survey findings, cloud modernization ranked as the leading priority in 2021, along with adopting modern application development techniques. Also at the top of the list was the need to capture, transform, and deliver AI at scale, as well as the need to innovate with edge and IoT technologies.
As Lee explains, to keep pace with these trends, organizations must establish a solid cloud-native foundation that enables them to adopt a number of mutually supporting technologies that include cloud-native applications, Kubernetes, artificial intelligence (AI), and machine learning (ML).
An important point, Lee notes, is the key role the cloud has played in making AI capabilities more accessible to a larger number of organizations. “Kubernetes comes out of an effort to have big stateless customer-facing applications at scale,” he explains, “and didn’t necessarily strike anybody as a candidate for delivering AI a few years ago, but now it is.”
The fact that Kubernetes and AI share so many cloud-native qualities makes them mutually supportive technologies. Kubernetes, says Lee, is intersecting with AI/ML, which is reflected in the Forrester Research survey findings.
As Lee explains, cloud-native is emerging as a business requirement for a number of reasons, including the need for application development automation, scalability, stability, deployment flexibility, and to accommodate further innovation.
Kubernetes has strategic business value across the entire technology stack, including infrastructure, development, and applications, all of which is orchestrated by automated DevOps at the core.
Cloud-native is critical for “future-fit” requirements, a term that Lee explained was coined by Forrester Research to encompass the idea that your architecture should be built to anticipate and accommodate key new cloud technologies as they arise. Organizations that fail to adopt cloud-native infrastructures that can accommodate AI risk falling dangerously behind their competitors.
Expanding Adoption of Containers and Kubernetes
During the webinar, Lee shared findings from Forrester's 2019 survey, in which 40% of North American enterprise infrastructure decision-makers said they expected to support containers/Kubernetes for new or existing projects. Forrester Research expects that figure to exceed 50% in 2022, making containers/Kubernetes the majority choice for new deployments.
This data aligns with our second annual Kubernetes in the Enterprise report, which showed that 53% of organizations' projects are currently running in production on Kubernetes. The trend is projected to continue in 2022, abetted by ongoing hybrid work-from-home arrangements that are increasing the need for cloud-based services.
Kubernetes: A Perfect Platform for AI/ML Workloads
As Kubernetes and AI/ML advance in tandem, Kubernetes is evolving to ease AI/ML adoption, Lee explained. In his article entitled “The Future Is Smart: Cloud Native + AI,” D2iQ CEO Tobi Knaup details why Kubernetes is the ideal platform for running AI/ML applications.
The move to smart cloud-native is being aided by Kubeflow, the chief open-source platform dedicated to making deployments of AI/ML workflows on Kubernetes simple, portable, and scalable. Kubeflow provides an end-to-end platform for AI/ML workflows that enables data scientists to build AI/ML applications without having to worry about the underlying Kubernetes infrastructure.
The Kubeflow Automated pipeLines Engine (KALE) simplifies the orchestration of AI/ML workflows on Kubernetes. D2iQ’s Kaptain AI/ML add-on reduces the complexity of Kubeflow and Kubernetes even further through automation, tooling, and integration of a select group of services.
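To make the underlying idea concrete, here is a minimal sketch of how an ML training step can run as a declarative Kubernetes workload; this generic example is not from the webinar, and the image name, file names, and resource figures are illustrative only:

```yaml
# Hypothetical sketch: a one-off model-training run expressed as a Kubernetes Job.
# Kubernetes handles scheduling, retries, and GPU placement declaratively --
# the same properties Kubeflow builds on for full AI/ML pipelines.
apiVersion: batch/v1
kind: Job
metadata:
  name: train-model                 # illustrative name
spec:
  backoffLimit: 2                   # retry the training step up to twice on failure
  template:
    spec:
      restartPolicy: Never
      containers:
        - name: trainer
          image: registry.example.com/ml/train:latest   # hypothetical image
          command: ["python", "train.py"]               # hypothetical entrypoint
          resources:
            limits:
              nvidia.com/gpu: 1     # ask the scheduler for a GPU node
              memory: 16Gi
```

Tools like Kubeflow and Kaptain generate and chain resources of this kind into multi-step pipelines, so data scientists work at the workflow level rather than hand-writing manifests.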
Taming Kubernetes Complexity
As Lee notes, Kubernetes is complex, and a do-it-yourself Kubernetes deployment is beyond the scope of most organizations. Most enterprises (80%, according to the Flexera State of the Cloud survey) realize they don't have the skills or budget to take on a DIY deployment. Fortunately, Kubernetes vendors like D2iQ and cloud service providers like AWS offer solutions that reduce Kubernetes complexity and enable organizations to get up and running quickly and reliably.
In his segment, D2iQ CEO Tobi Knaup explains how the D2iQ Kubernetes Platform (DKP) simplifies Kubernetes and AI/ML deployment and management by providing “a complete, open, production-grade stack that is ready to go.” This includes GitOps workflow, which, he says, “is the most bulletproof way to manage cloud-native infrastructure.”
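The webinar does not detail DKP's GitOps tooling, but the general pattern is worth illustrating: the desired state of the cluster lives in a Git repository, and a controller continuously reconciles the cluster to match it. A hypothetical declaration in the style of Argo CD (one common GitOps controller; the names and repository URL below are assumptions) might look like this:

```yaml
# Hypothetical GitOps declaration (Argo CD style, shown for illustration):
# the controller watches the Git repo and keeps the cluster in sync with it.
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: my-app                      # illustrative name
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://github.com/example/app-config.git  # hypothetical repo
    targetRevision: main
    path: manifests                 # directory of Kubernetes manifests in the repo
  destination:
    server: https://kubernetes.default.svc
    namespace: my-app
  syncPolicy:
    automated:
      prune: true                   # remove resources deleted from Git
      selfHeal: true                # revert out-of-band cluster changes
```

Because every change flows through Git, deployments are auditable and reversible, which is what makes the approach “bulletproof” for managing cloud-native infrastructure.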
Other key DKP features include observability, military-grade security, application catalog, real-time cost management, AI/ML simplification, and Insights that help automate Kubernetes management. “DKP is a smart cloud-native platform capable of running smart cloud-native applications,” Tobi notes. “This means it can run AI/ML workloads and the platform itself is smart.”
A critical feature of DKP, he noted, is that the platform is based on pure open-source Kubernetes, which enables customers to deploy in any cloud, on-premises, edge, or air-gapped environment and enjoy the full innovation of the open-source community.
Tobi describes how SAIC, a joint D2iQ and AWS customer, has deployed D2iQ to achieve a fully secure, classified, air-gapped Kubernetes environment for the U.S. Department of Defense and intelligence community that could run AI/ML workloads across multiple clouds and at the edge.
Automation, a centralized management plane, and the use of declarative APIs “from top to bottom” are key to enabling customer success in complex AI/ML and edge environments like these, Tobi added.