Study Shows Big Uptake of Enterprise AI and Cloud Native Technologies

As generative AI workloads and cloud native technologies proliferate, global decision-makers surveyed for the 2025 Enterprise Cloud Index cite infrastructure, security and talent issues as top deployment and scalability barriers.

February 12, 2025

Like so many tech paradigm shifts that preceded it, generative AI is prompting enterprises to overhaul major aspects of their business and IT operations to accommodate a new wave of computing.

The recently published 7th Annual Nutanix Enterprise Cloud Index (ECI) report, which combines input from 1,500 IT and business decision-makers worldwide, indicates there’s a significant technological shift underway as GenAI enters a hockey-stick growth phase. This is happening just two years after OpenAI's ChatGPT kicked off the commercial GenAI industry.

The ECI findings showed that beneath the rapid uptake of new application workloads like GenAI lies widespread adoption of cloud native technologies that enable seamless, secure access to data across hybrid and multicloud IT environments.

“This year’s ECI revealed key trends that we’re hearing from customers,” said Lee Caswell, senior vice president of product and solutions marketing at Nutanix. 

Big challenges arise from these key trends, including scaling GenAI workloads, data governance, privacy and visibility, and integration with existing IT infrastructure.

“To successfully unlock ROI with GenAI projects, organizations need to take a holistic approach to modernizing applications and infrastructure and embrace containerization,” Caswell said.

State of Enterprise AI Deployment

Based on research collected in late 2024 by U.K. researcher Vanson Bourne for Nutanix, the latest ECI report found that nearly 85% of responding companies already had a GenAI deployment strategy in place and 55% were actively implementing it. Only 2% said they hadn’t yet created a plan for GenAI deployment, and none said they had no intention of creating one.

Respondents agreed that infrastructure and operational changes are necessary to support GenAI, which enables the automatic generation of content such as text, video, images and music. Nearly all respondents (98%) cited difficulties scaling the compute-intensive technology from development to production, so it’s no surprise that more than half said they are prioritizing investments in infrastructure (54%) and in the skills development (52%) needed to support an AI-centric cloud computing environment.

Top AI Trends

ECI findings strongly indicate that the success of GenAI across enterprises requires a modernized IT operation that leverages cloud native system architectures, application containerization, hardware acceleration and an expanded approach to data governance. In addition, these new capabilities require robust training and skills acquisition, both of which are currently scarce in most responding organizations.

Given the transitional stage of enterprise deployment strategies, 42% of organizations expect to just break even or face losses with their GenAI projects in the coming year, while 56% expect some gain. The situation is expected to turn around quickly, however, with 70% expecting a positive ROI within three years as their AI-friendly technology and processes mature.

Early Focus and Challenges

Respondents reported that their near-term GenAI focus is to improve productivity levels and efficiency, particularly in the areas of customer support and experience. They said they also aspire to more analytics-driven AI applications, such as cybersecurity for fraud detection and data loss prevention, in the near future.

At the same time, enterprises indicated that they are scrambling to overcome outstanding issues around IT infrastructure, skillsets and data governance to deliver on their aggressive GenAI goals.

They cited AI-IT infrastructure integration difficulties (54%) and skills scarcity (52%) as posing the biggest barriers to scaling GenAI applications. Most (94%) agreed that cloud-native architectures, in combination with containerization tools, have become the “gold standard” for deploying and supporting GenAI and other modern AI applications at scale.

Cloud Native Technology Uptake

When it comes to GenAI apps specifically, 70% of respondents said they have containerized or intend to containerize them, the highest percentage among all application categories.

Why cloud native and container technologies?

“For AI developers, cloud native technologies deliver the ability to package models and dependencies into containers and deploy them seamlessly via orchestrators like Kubernetes,” said Dan Ciruli, senior director of cloud native product management at Nutanix. 

“This supports portability across environments, from on-premises data centers to public clouds.”

For data scientists and machine learning engineers, said Ciruli, “cloud-native tooling simplifies building machine learning pipelines and putting models into production. On the infrastructure side, using Kubernetes for dynamic resource management helps optimize expensive AI workloads.”
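As an illustrative sketch of the resource-management idea Ciruli describes (not drawn from the report), a Kubernetes Deployment for a containerized model server can declare its GPU needs so the orchestrator places it on a node with free accelerators. Here the manifest is built as a plain Python dict mirroring the YAML fields; the image name and resource numbers are hypothetical:

```python
# Illustrative sketch: a Kubernetes Deployment manifest for a containerized
# model server, expressed as a Python dict. In practice this would be YAML
# applied to a cluster with kubectl; names and numbers here are made up.
def model_server_deployment(name: str, image: str, gpus: int, replicas: int = 1) -> dict:
    """Build a Deployment-style manifest that asks the scheduler for GPUs."""
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        # GPU limit: Kubernetes schedules the pod only onto
                        # a node with this many unallocated accelerators.
                        "resources": {"limits": {"nvidia.com/gpu": gpus}},
                    }],
                },
            },
        },
    }

manifest = model_server_deployment("llm-server", "registry.example.com/llm:0.1", gpus=1)
```

Because the GPU requirement travels with the workload description rather than with any particular machine, the same manifest can be applied on premises or in a public cloud, which is the portability Ciruli refers to.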

Software and Hardware Mandates

With nearly everyone in agreement about cloud-native and containerized architectures, what’s the infrastructure integration hitch?

There are both software and hardware issues to consider, according to Nutanix experts.

81% feel their current IT infrastructure requires improvement

Ciruli explained that tools for deploying and managing the new environment are still maturing. Containerizing applications, he noted, requires software to be written differently than in the past, unlike the shift about two decades ago from independent, application-specific servers to virtual machines (VMs), which could run existing software largely unchanged.

“Now, you’ll be dealing with some apps on VMs and some on containers at the same time,” he said. 

“Enterprises need tools to manage both, ideally together. Along with the challenge of a new type of application running on the infrastructure [GenAI and other AI], the tools for operating them need to evolve.”

Enterprise IT infrastructures also require the raw horsepower to accommodate the compute-intensive GenAI/AI applications being deployed, said Debojyoti "Debo" Dutta, Nutanix’s chief AI officer.

“They need a GPU-accelerated infrastructure with awesome data governance to run AI applications and high-performance, well-managed data platforms,” Dutta said. 

“The reason is that AI will actually operate on your enterprise data, and you’ll need to have full control over it.”


Dutta added that it's imperative that enterprises understand the cost governance implications of these changes, in part through the use of cloud cost-management tools and practices.

Fresh Security and Governance Considerations

Unwaveringly, cybersecurity-related concerns have topped ECI respondent challenge lists for each of the seven years the study has been conducted. GenAI and AI, however, introduce new concerns, explained Dutta.

“They expand data protection, privacy and integrity efforts to the GenAI large language models (LLMs),” he said.

The LLMs at the heart of GenAI are trained and tuned on massive amounts of data to enable a wide variety of automated functions: software code generation; the creation, summarization, translation and classification of text; and virtual agents/chatbots that can answer questions, to name a few.

“Once you train and fine tune a model with your own data, your model needs governance,” Dutta said. 

“Without the model governance, it can be used to reverse engineer your data in many cases,” which has security ramifications. “So extending corporate governance to [LLMs] is a new factor enterprises must address.”


When it comes to model safety, he advised, “LLMs should be treated as first-class corporate [intellectual property], especially if they're trained or fine-tuned with the organization’s proprietary data. We are very early in the process of creating robust systems for model safety and governance. That includes putting the right guardrails on the models so they don’t violate societal norms.”

He noted that significant regulation is currently being crafted to figure out how LLMs can be used safely beyond the walls of a given enterprise. 

“We need to track this space very well because it is evolving.”

Finding the Right AI and Cloud Native Technology Skills

About half of ECI respondents (52%) said it was important for their organizations to invest in IT training to support GenAI, while 48% thought their companies would have to hire new IT talent.

Dutta pointed out that “AI skills mean different things for different people. For technical people like software engineers, that could mean learning how to use AI to write software better. If you're not a computer scientist, you still need to learn AI to do your current job better. That involves techniques like prompt engineering,” which involves honing AI model inputs to return optimum outputs.
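Prompt engineering, as Dutta uses the term, largely amounts to structuring a model's input deliberately: stating a role, supplying context, and pinning down the output format. A minimal, model-agnostic sketch (no real AI API is called; the template and field names are illustrative):

```python
def build_prompt(role: str, task: str, context: str, output_format: str) -> str:
    """Assemble a structured prompt. Spelling out the role, the context,
    the task, and an explicit output format tends to produce more
    predictable model responses than a bare one-line question."""
    return (
        f"You are {role}.\n"
        f"Context:\n{context}\n"
        f"Task: {task}\n"
        f"Respond strictly as: {output_format}"
    )

prompt = build_prompt(
    role="a support analyst",
    task="summarize the customer's issue in one sentence",
    context="Ticket #123: login fails after password reset.",
    output_format="a single plain-text sentence",
)
```

The resulting string would then be sent to whatever model the organization uses; the technique itself is independent of any particular vendor.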

On the foundational cloud-native side of the equation, said Ciruli, among the key skills enterprises require is knowledge of modern cloud-native concepts for secure network connectivity, storage, data security and assessing the health of GenAI/AI workloads.

“When apps are written to be deployed in cloud-native environments, there are new methods for handling [these functions],” he said. “They involve lots of new tools that differ from the traditional ecosystem tools that have been around for years.”

“VMs always had the same IP address, while a container can move from place to place.” 

This affects how network connectivity and security must be managed, he said, requiring operators to embrace the Kubernetes container orchestrator to manage distributed file systems.


Where can enterprises find training?

Ciruli pointed out that for cloud-native skills improvement, the Cloud Native Computing Foundation (CNCF) offers accreditation curricula for individuals, as well as for vendors who want to teach those skills through their own programs. 

“It’s a smart move by CNCF to also offer education programs for the entire industry,” he said. “The efforts are really paying dividends.”

Likewise, Dutta said there are abundant courses available for non-engineers to become proficient with GenAI, findable with a mere “Generative AI for everyone” Google search. 

“There are courses that specifically allow anybody to just pick up prompt engineering and start using the current AI models to be very productive,” Dutta said.

For engineers, he said, “If you look at any top-tier U.S. university today, you'll see a lot of introductory courses…that get very advanced very quickly. So the next generation of GenAI engineers is being minted as we speak to fill the gap.”

What’s Next for Cloud Native and AI-powered Enterprises?

At this juncture, organizations are eager to implement GenAI for productivity, automation and innovation gains with an eye toward more analytics-driven automation applications. However, IT infrastructure modernization, talent and new data security and governance hurdles are driving new investments in cloud-native, hardware-accelerated infrastructure and skills development. With the current speed of adoption, businesses don’t have the luxury of sitting back to see how things unfold before they act.

“Cloud native and AI technologies are likely the fastest accelerating technologies in history, happening around the world, in every major corporation,” said Ciruli. For this reason, he said, he now takes traditional discussions of five-year IT roadmaps with a grain of salt.

“Five years ago, where was AI on a 2025 roadmap? For most, it was nonexistent.”

Joanie Wexler is a veteran IT journalist and marketing writer who covers the gamut of emerging technologies, how to apply them and their business impact. She has been a Nutanix Forecast contributor since 2017.

Ken Kaplan contributed to this article. He is Editor in Chief for The Forecast by Nutanix. Find him on X @kenekaplan and LinkedIn.

© 2025 Nutanix, Inc. All rights reserved. For additional information and important legal disclaimers, please go here.
