AI and Cloud Native Alchemize the Future of Enterprise IT

After hitting the top of the hype cycle, cloud native and various artificial intelligence technologies combine to drive IT capabilities into the future.


By Ken Kaplan September 6, 2024

Generative AI, AI-augmented development and cloud native capabilities hit the Peak of Inflated Expectations in Gartner’s 2023 Hype Cycle for Emerging Technologies.

“Every product briefing I go to, if it doesn’t have the word ‘AI’ in it, I’m in the wrong room,” Steve McDowell, chief analyst at NAND Research, told The Forecast. 

“It’s all anybody wants, whether it’s relevant or not.”

Gartner’s Hype Cycle technologies are typically at an early or embryonic stage, Gartner Distinguished VP Analyst Arun Chandrasekaran stated in the 2023 Emerging Technologies report.

“Great uncertainty exists about how they will evolve, so there are greater risks for deployment, but potentially greater benefits for early adopters.”

One year later, those three capabilities already power most of what Gartner dubbed the Top 10 Strategic Technology Trends for 2024, including:

  1. AI Trust, Risk and Security Management (AI TRiSM)

  2. Platform Engineering

  3. AI-Augmented Development

  4. Industry Cloud Platforms

  5. Intelligent Applications

  6. Democratized Generative AI

  7. Augmented Connected Workforce

While some organizations are still kicking the tires, many are pushing forward and learning as they go. They see cloud native and AI as increasingly intertwined and rising in importance to the future of their business. Moving beyond the hype surrounding these technologies requires understanding real-world use cases and then squaring those capabilities with real business needs and opportunities, according to McDowell.

“Once you start talking about what the business use is, and the value you’re trying to extract from any technology, that drives technology decisions,” he said.

He said 2024 brought a huge shift as CIOs explored how AI can enable digital transformation and drive worker productivity and IT efficiency. Many moved ahead quickly, working out how to manage their data and put applications in the optimal environment, whether that is their own data center, an edge computing location or the public cloud.

Related

IT Leaders Get AI-Ready and Go

Adopting AI and cloud native technologies follows the same arc as previous transformational technologies, McDowell said.

“Although timelines are getting shorter,” he said. “Understand it, enable it, play with it, and then drive change.” 

Interest in Cloud Native and AI Coalesces

Enthusiasm around cloud native-powered AI applications stems from the potential to make better, faster data-driven decisions and achieve higher productivity, especially for mundane, repetitive tasks and for managing IT operations, which have largely become a complex web of hybrid multicloud environments. The growing need for efficiency and interoperability to manage applications and data across different environments intensifies the hunger for, and the hype around, cloud native AI capabilities.

Popular cloud native platforms like Kubernetes have matured in recent years, making container orchestration more accessible and common for building and managing applications and microservices. Because these cloud native applications aren’t dependent on particular IT infrastructure, IT teams have more options for running those applications wherever it makes business sense.

Related

AI, Cloud Native and Hybrid Cloud Fuse to Run Apps and Data Anywhere

First developed by Google, Kubernetes has proven effective at handling various workloads within Google’s infrastructure for years. Leading companies like OpenAI, Spotify, and Uber also deploy models on Kubernetes, highlighting its ability to manage dynamic and complex computational demands. Over 60% of organizations have adopted Kubernetes, and more companies are planning to do so, according to research firm Statista. 

“The dependency on Kubernetes is apparent, especially in large organizations,” stated Statista. “Kubernetes has become the fastest-growing project in the history of open-source software. It is second only to Linux, with a market size estimated at USD 1.46 billion in 2019. Moreover, it's expected to increase at a compound annual growth rate (CAGR) of 23.4% by 2031.”

The surge in Kubernetes adoption shows how crucial the platform has become to business operations, and that momentum is now rolling into the world of AI application development, according to Dan Ciruli, senior director of product management at Nutanix.

“All of the big companies, they're running all of that AI stuff on Kubernetes,” Ciruli said in an interview with The Forecast.

“I don't know of anybody who's really writing new AI-based apps and running those models in VMs (virtual machines). That's all happening in Kubernetes. So in that sense, AI is built on cloud native.”

Kubernetes offers unparalleled flexibility in resource management, particularly for expensive and scarce resources like GPUs, ensuring the high utilization that AI training and inference demand. It also facilitates the rapid iteration and deployment essential for AI development, where models and applications require frequent updates and improvements. This enables organizations to bring AI-driven solutions to market more quickly and enhances operational efficiency through automation and intelligent resource management.
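To make that concrete, here is a minimal sketch, using the official Kubernetes Python client, of how a team might ask the scheduler for one of those scarce GPUs for an inference workload. The image name, namespace and the nvidia.com/gpu resource key (exposed by NVIDIA’s device plugin) are illustrative assumptions, not details from the companies quoted in this article.

```python
# Minimal sketch: request a single GPU for a hypothetical inference pod.
# Requires the official client library: pip install kubernetes
# Image name, namespace and the GPU resource key are illustrative assumptions.
from kubernetes import client, config

def launch_inference_pod():
    config.load_kube_config()  # reads the local kubeconfig and current context

    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(name="llm-inference-demo"),
        spec=client.V1PodSpec(
            restart_policy="Never",
            containers=[
                client.V1Container(
                    name="inference",
                    image="registry.example.com/llm-server:latest",  # hypothetical image
                    resources=client.V1ResourceRequirements(
                        # The scheduler places this pod only on a node with an
                        # unallocated GPU, so scarce accelerators stay fully used
                        # without anyone assigning them by hand.
                        limits={"nvidia.com/gpu": "1"},
                    ),
                )
            ],
        ),
    )
    client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)

if __name__ == "__main__":
    launch_inference_pod()
```

The pattern, more than the specific calls, is the point: the workload declares what it needs, and Kubernetes finds room for it.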

Related

Managing AI at the Edge

“It's a great fit for these AI workloads because they tend to be very dynamic,” Tobi Knaup, senior director and general manager for Cloud Native at Nutanix, told The Forecast.

“They need to share resources, very expensive resources in this case. GPUs are very expensive and in short supply. Also, organizations want to iterate on AI very quickly. So Kubernetes really enables that. It enables people to ship software fast.” 

Knaup said Nutanix’s Enterprise AI platform can run on Kubernetes clusters, making it easy for organizations to put new AI models into production. His team is integrating AI into Kubernetes capabilities (see Nutanix Kubernetes Platform) in the Nutanix Cloud Platform to help enterprises overcome challenges related to cloud native technology adoption. For example, he said AI Navigator, an AI-powered chatbot integrated into the Nutanix Kubernetes® Platform, can assist engineers by providing solutions to common issues and offering real-time insights into system configuration and performance.

Related

The Shift to Cloud Native Business Application Development

The modular, portable nature of containers running on Kubernetes means AI applications can be deployed anywhere, from on-premises data centers to public clouds, without vendor lock-in, which is crucial for organizations pursuing hybrid and multicloud strategies shaped by cost, performance and data sovereignty requirements.
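As a rough sketch of that portability, the snippet below applies the same Deployment object to two clusters, one on-premises and one in a public cloud, simply by switching kubeconfig contexts. The context names and container image are hypothetical, chosen only to illustrate the idea.

```python
# Minimal sketch: the same Deployment spec applied to two different clusters
# (on-prem and public cloud) by switching kubeconfig contexts.
# Context names and the container image are illustrative assumptions.
from kubernetes import client, config

def make_deployment() -> client.V1Deployment:
    container = client.V1Container(
        name="ai-app",
        image="registry.example.com/ai-app:1.0",  # hypothetical image
    )
    return client.V1Deployment(
        metadata=client.V1ObjectMeta(name="ai-app"),
        spec=client.V1DeploymentSpec(
            replicas=2,
            selector=client.V1LabelSelector(match_labels={"app": "ai-app"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "ai-app"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )

# The identical object is created on both clusters; only the context changes,
# which is what keeps the application free of any one provider's specifics.
for context_name in ("onprem-dc", "public-cloud"):  # hypothetical kubeconfig contexts
    api_client = config.new_client_from_config(context=context_name)
    client.AppsV1Api(api_client).create_namespaced_deployment(
        namespace="default", body=make_deployment()
    )
```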

Looking Ahead

Cloud native-powered AI applications can be transformative, and their potential is just beginning to be realized, according to Ciruli. 

“We’ve only seen the tip of the iceberg,” he said.

“I'm not the first person to make this analogy, but watching AI right now reminds me of the 1990s internet and realizing we don't know all the ways this technology is going to change everything, but it's clear it will change everything. That's where AI is right now.”

It's still very early, but AI innovation is rapidly making its way into enterprise IT operations and application development.

“You can already tell that AI will be everywhere: in the data center, in the cloud, at the edge, on your phone, on your watch. Everywhere.”

For AI to work everywhere, applications and data need to run from anywhere. 

Ken Kaplan is Editor in Chief for The Forecast by Nutanix. Find him on X @kenekaplan.

© 2024 Nutanix, Inc. All rights reserved.
