Conclusion and Recommendations
The findings of the 7th Annual Nutanix Enterprise Cloud Index illuminate the profound shifts in enterprise IT strategies, driven by the rapid adoption of application containerization and Generative AI (GenAI) solutions. With over half of organizations fully containerizing their applications, this approach has emerged as the de facto standard, enabling scalability, portability, and agility across hybrid and multicloud environments. The symbiotic relationship between containerization and GenAI underscores the importance of adopting cloud native principles to meet the demands of these emerging workloads.
The survey results also highlight the dual challenges and opportunities posed by GenAI adoption. While organizations are eager to leverage GenAI for productivity, automation, and innovation, they also face critical hurdles in the form of data security, compliance, and IT infrastructure modernization. The widespread acknowledgment that organizations lack the skills and robust governance needed to effectively launch and manage GenAI solutions signals a critical need for enterprises to align their technical initiatives with business objectives. Investments in IT training, talent acquisition, and advanced infrastructure are imperative to unlock the full potential of GenAI while addressing its inherent complexities.
Ultimately, the report underscores the need for a holistic approach to application and infrastructure modernization. By prioritizing security, fostering talent, and embracing orchestration platforms like Kubernetes, organizations can effectively navigate the challenges of scaling GenAI workloads from development to production. As enterprises continue to refine their strategies, the next frontier of innovation will be defined by their ability to integrate GenAI seamlessly into their broader IT ecosystems.
The Rise of Generative AI Applications Within the Enterprise
This year's ECI report focused specifically on gaining a better understanding of GenAI solution adoption, investment priorities, and key benefits realized by organizations today.
Survey results show that the vast majority of responding organizations (85%) already have a GenAI strategy in place, although the level of implementation of that strategy varies. Meanwhile, just 2% of organizations reported not having started planning their GenAI strategy.
Figure 4: Current State of Enterprise GenAI Strategy Development/Implementation (response options: we have a strategy in place and are actively implementing it; we have a strategy in place but have not yet started implementation; we are in the early stages of developing our strategy; we have not started planning our strategy yet; we aren't going to develop a strategy).
85% of organizations already have a GenAI strategy in place
Although most organizations reported having an enterprise GenAI strategy in place, it is not always obvious how GenAI supports the organization's higher-level business goals or strategies. It is critical that organizations resist the temptation of “doing AI for AI’s sake.” AI initiatives and solutions should always be tied to larger business goals with measurable metrics so that the success (and ultimately the ROI) of AI solution implementation can be measured and evaluated over time.
Respondents from this year’s ECI survey cited increased productivity, automation, efficiency, and innovation as the top business-related goals and strategies supported by GenAI. Just 1% of respondents said GenAI does not/could not support their organization’s overarching business goals and strategies.
Figure 5: Top Business Goals and Strategies Supported by GenAI (response options: increasing productivity; increasing automation and efficiency; increasing innovation; customer retention and support; decreasing operational costs; employee onboarding; industry differentiation; GenAI cannot support our business goals).
In terms of GenAI application and workload deployment, the majority of respondents (53%) say they are leveraging GenAI-based customer support and experience solutions (e.g., improving customer support chatbots, analyzing customer feedback, personalizing the customer experience, and verifying identity). However, this may change: respondents indicate that over the next 1-3 years, their focus will shift to implementing GenAI-based cybersecurity, fraud detection, and loss prevention solutions.
Figure 6: GenAI Workloads Leveraged Today and in Next 1-3 Years (response options: customer support and experience; cybersecurity and fraud detection; content generation; code generation and code co-pilots; other; we aren't leveraging GenAI or planning to. Results shown for today and for the next 1-3 years.)
Finally, it is critical to remember that GenAI solutions and workflows are not implemented in a vacuum. The effectiveness of enterprise GenAI-based solutions depends on seamless integration into the wider IT ecosystem, including the enterprise-grade resiliency, security, and Day 2 operations used for other business-critical workloads, as well as reliable, secure access to a range of data sources. Therefore, the success of GenAI-based workflows requires investment in a range of supporting technologies and services (e.g., containerization). More than half of the respondents in this year’s survey (54% and 52%, respectively) say their organization’s IT infrastructure and IT training require additional investment to support GenAI applications and workloads:
Figure 7: Supporting Areas Requiring Investments to Improve GenAI Applications (response options: IT infrastructure; IT training; cybersecurity; IT talent hiring; data management; data governance; application development).
Spotlight: Are Organizations Getting Their Return on Investment When It Comes to GenAI?
Media hype and the commercial availability of AI tools and services have driven enterprise interest in AI technologies to new heights over the past two years. The 2023 Nutanix State of Enterprise AI Report explored the impact of AI’s “honeymoon” phase, and the fact that many early adopters of AI solutions would experience fewer budget-related roadblocks when it comes to technology evaluation and implementation.
This year, we learned that 90% of respondents expect their IT costs to rise due to GenAI and modern application implementation. Eventually this honeymoon phase will fade, and spending/budgeting expectations for AI projects and technologies will be required to come in line with the rest of the IT portfolio.
To better understand trends around GenAI spending and business results, respondents were asked specifically about their organizations’ expectations for ROI when it comes to GenAI projects. More than half of organizations (52%) said that cost of ownership and ROI visibility will be a challenge associated with scaling GenAI workloads from development into production.
When asked to gauge ROI expectations over the coming years, only 56% of organizations said they expect to achieve any return on their investment from GenAI projects within the next 12 months; the remainder expect to break even or make a loss over the next year. However, with 70% of organizations expecting to make a return on their investment over the subsequent two to three years, we can confidently assume that most organizations will be expecting (and budgeting for) some sort of GenAI payoff in the 2026/2027 timeframe. This is still quite a generous timetable for ROI results from an IT solutions perspective.
70% expect to make a return on their investment from GenAI over the next 2–3 years
This raises the question: are expectations around GenAI ROI different from those for other IT solutions? This year’s survey results uncovered an additional nuance that helps address this question about ROI perceptions. Respondents were asked to identify which job role has ultimate responsibility for GenAI implementation and budgets at their organization:
Which job role within your organization has the ultimate responsibility for GenAI implementation resources and budgets? (Responses, in rank order: Chief Executive Officer; Chief Information Officer; Chief Technology Officer; Chief Data Officer; Chief AI Officer; Chief Financial Officer; Chief Innovation Officer; Chief Operating Officer; a cross-function GenAI team; GenAI resources are handled within each line of business.)
The CFO ranks 6th for decision-maker responsibility when it comes to GenAI implementation – below the CTO, CIO, CEO, CDO, and CAIO. This result gives us further insight into organizational decision-making processes when it comes to GenAI projects. Financial implications may be important during evaluation, but the ultimate decision to implement GenAI projects may not be primarily a financial one, as supported by this rank order.
Finally, on a more positive note, only 2% of respondents in this year’s ECI survey said they “struggle to measure ROI of their organization’s GenAI projects over the long term (1-3 years).” The bottom line here is that the vast majority of organizations are at least implementing GenAI projects with ROI goals and measurements in mind, and are gathering the metrics needed to make long-term evaluations of the value of these solutions. This means that regardless of the ROI timetable, organizations will have the data they need to inform financial decisions regarding GenAI projects when the need arises.
From Development to Production – Insights into GenAI Workload Implementation and Lifecycles
As with any modern or cloud native application stack, developers need fast, easy access to resources and services on reliable, scalable infrastructure. This includes elements of both the Infrastructure-as-a-Service (IaaS) ecosystem (e.g., compute, storage, networking) and the Platform-as-a-Service (PaaS) ecosystem (e.g., AI inference endpoint services, dev/test tools, serverless functions, multicloud orchestration tools).
While the majority of containerized applications are born in the cloud, many may need to be moved to a different public, private, edge, or managed cloud for cost, governance, or other reasons. However, very few are truly multicloud and built to move easily from one cloud to another (e.g., to use similar IaaS and PaaS resources in AWS as they would in Azure or Google Cloud) or on-premises. Many stateful cloud native applications also lack consistent data services to store, protect, and process data across clouds. These complexities are just a few of the reasons why building truly portable, multicloud applications is such a challenge for modern IT organizations.
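To make the portability problem concrete, here is a minimal, hypothetical sketch (not from the report) of one pattern teams use to keep stateful cloud native code movable: the application talks to a neutral storage interface, and each cloud or on-premises environment supplies its own adapter. All class, method, and key names below are illustrative assumptions.

```python
from typing import Protocol


class ObjectStore(Protocol):
    """Neutral storage contract the application codes against."""
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...


class InMemoryStore:
    """Stand-in adapter; real deployments would wrap S3, Azure Blob Storage,
    Google Cloud Storage, or an on-prem object store behind the same methods."""
    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]


def save_checkpoint(store: ObjectStore, weights: bytes) -> None:
    # Application code never names a specific cloud, so the workload can move
    # with a one-line change to which adapter is constructed at startup.
    store.put("checkpoints/latest.bin", weights)


save_checkpoint(InMemoryStore(), b"\x00\x01")
```

The point of the sketch is the seam, not the storage: consistent data services across clouds are what make the application, and its state, portable.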
Emerging GenAI applications are not immune to these infrastructure challenges. Results from this year’s ECI survey indicate that almost all respondents (98%) face challenges when it comes to scaling GenAI workloads from development to production. In fact, the #1 challenge organizations face when scaling GenAI workloads from development into production is integration with existing IT infrastructure.
Figure 8: Challenges Faced When Scaling GenAI Workloads from Development to Production (response options: integration with existing IT infrastructure; lack of skills needed to deploy and operate AI; cost of ownership/return on investment (ROI) visibility; regulatory and compliance hurdles; limited computational resources; we have not faced challenges).
98% of organizations face challenges when scaling GenAI workloads from development to production
In addition to the infrastructure-related challenges that come with GenAI workload implementation, GenAI applications themselves present unique challenges associated with data processing, model development, training, and maintenance. 79% of respondents say they plan to have processes or tools in place to manage the lifecycle of GenAI models from development to deployment, though the number that actually have them in place today is likely much lower. This should be considered a focus area for immediate improvement when organizations consider the tools and processes needed to make GenAI projects a long-term success.
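As one illustration of what such lifecycle tooling can look like in practice, the hedged sketch below uses the open-source MLflow tracking API to record how a GenAI model was produced and evaluated so it can be traced from development through deployment. The experiment name, parameters, and metric values are invented for illustration and are not from the survey.

```python
import mlflow

mlflow.set_experiment("genai-summarizer")              # hypothetical project name

with mlflow.start_run(run_name="fine-tune-v3") as run:
    mlflow.log_param("base_model", "example-7b")        # which foundation model was adapted
    mlflow.log_param("training_data", "support-tickets-2024q3")
    mlflow.log_metric("eval_rougeL", 0.41)               # illustrative evaluation score
    mlflow.set_tag("stage", "candidate")                 # flipped to "production" by a later review step

print("tracked run:", run.info.run_id)
```

Whether teams use a third-party MLOps platform, in-house tooling, or their cloud provider's integrated tools, the goal is the same: every deployed model version can be traced back to its data, parameters, and evaluation results.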
Figure 9: Processes and Tools Organizations Plan to Put in Place to Manage the Lifecycle of GenAI Models from Development to Deployment and Maintenance (response options: third-party machine learning operations (MLOps) platforms; in-house developed tools and processes; integrated tools within our cloud service provider; we don't/will not have any processes or tools in place to manage the lifecycle of GenAI models).
Top challenge when scaling GenAI workloads from development into production: integration with existing IT infrastructure
Model accuracy is another unique aspect of the GenAI workflow that requires consideration. Almost all organizations (99%) say they plan to monitor and optimize their GenAI models in production to ensure performance and accuracy – with the majority of respondents saying they will perform continuous monitoring using automated tools. Clearly, organizations see the value in committing to this task over an extended period.
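As a rough illustration of “continuous monitoring using automated tools,” the sketch below shows a scheduled quality check that scores a sample of a model's recent production outputs against references and raises an alert when average quality drops below a threshold. The scoring function and the 0.85 threshold are assumptions for illustration, not details from the report.

```python
from statistics import mean

QUALITY_THRESHOLD = 0.85   # assumed minimum acceptable average score


def score(output: str, reference: str) -> float:
    """Placeholder scorer; real pipelines might use exact match, task-specific
    metrics, or an LLM-as-judge evaluation."""
    return 1.0 if output.strip().lower() == reference.strip().lower() else 0.0


def nightly_quality_check(samples: list[tuple[str, str]]) -> float:
    """samples: (model_output, reference) pairs collected from production traffic."""
    avg = mean(score(out, ref) for out, ref in samples)
    if avg < QUALITY_THRESHOLD:
        # In practice this would page an on-call team or open an incident ticket.
        print(f"ALERT: average quality {avg:.2f} fell below {QUALITY_THRESHOLD}")
    return avg


nightly_quality_check([("42", "42"), ("yes", "no")])
```

Automated checks like this are what turn monitoring into a sustained commitment rather than a one-time validation at launch.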
Figure 10: Approaches to Monitoring and Optimizing GenAI Models in Production (response options: continuous monitoring using automated tools; manual performance checks; a combination of both).
A final consideration when thinking about GenAI workload implementation and lifecycles is hardware support. Many GenAI workloads require specialized compute (GPUs, APUs, TPUs, etc.) to perform their tasks. Organizations can either consume these computing resources as a cloud service, or purchase and integrate the necessary hardware components into their own data centers and edge locations for inference. Supply constraints, network connectivity and accessibility, performance, and budget, as well as data privacy, compliance, and locality, all come into consideration when organizations think about hardware requirements for GenAI. Our survey results show only 60% of organizations have a concrete plan when it comes to GenAI-specific hardware. The other 40% are still investigating the use of hardware specific to GenAI, or considering how to start that process:
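The snippet below is a small, hypothetical illustration of the "own hardware vs. cloud service" decision described above: it checks whether an accelerator is available locally (assuming PyTorch is installed) and otherwise falls back to a remote managed inference endpoint. The endpoint URL is a placeholder, not a real service.

```python
import torch


def pick_inference_target() -> str:
    if torch.cuda.is_available():
        # Specialized accelerators exist in the local datacenter or edge node,
        # so inference can stay close to the data for privacy and locality.
        return f"local accelerator: {torch.cuda.get_device_name(0)}"
    # Otherwise route requests to a managed GPU service in a public cloud.
    return "remote endpoint: https://inference.example.com/v1 (placeholder)"


print(pick_inference_target())
```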
Figure 11: Plans to Use GenAI-Specific Hardware (response options: fully dependent on GenAI-specific hardware; using GenAI-specific hardware for critical workloads only; investigating the use of GenAI-specific hardware; investigating the use of GenAI-specific hardware in the next 12 months; investigating the use of GenAI-specific hardware beyond the next 12 months; we have no plans to use GenAI-specific hardware).
Spotlight: Your DevOps and Engineering Teams are Already Leading the GenAI Charge
This year’s survey highlights some interesting differences in the way GenAI challenges and implementation are perceived among IT decision makers compared to platform engineering and DevOps decision makers.
When asked how challenging, if at all, organizations find GenAI adoption, just 58% of platform engineering and DevOps respondents said they perceive this as a “challenge.” On the other hand, 72% of IT decision makers said GenAI adoption is a “challenge” for their organization. Quite the difference in opinion.
Digging deeper, we also found notable differences in the perception of GenAI strategy and implementation among these respondent types, which may help explain the challenge dichotomy:
How far along is your organization with its GenAI strategy? (Response options: we have a strategy in place and are actively implementing it; we have a strategy in place but have not yet started implementation; we are in the early stages of developing our strategy. Results shown for IT decision makers vs. platform engineering and DevOps decision makers.)
There is a 10-percentage-point difference in the perceived level of GenAI implementation between these decision-maker groups. Platform engineering and DevOps respondents report being ahead of their IT counterparts when it comes to implementation. This may reflect differences in perception regarding the overall challenge of strategic implementation: while IT decision makers are concerned with holistic implementation of tools and strategy, platform engineering and DevOps decision makers may be running with solutions in a more bespoke or trial fashion and consider that level of implementation “good enough” for their team’s needs.
Regardless of these differences in perception, the result gives us key insight into where GenAI applications and workloads are actually being implemented, and by whom. Our data suggests that in some cases, platform engineering and DevOps teams may already be off and running with GenAI implementations without the knowledge or support of their IT colleagues.
New Applications Are Driving a New Industry Standard
The 7th Annual Nutanix Enterprise Cloud Index (ECI) Report highlights an important industry shift: more than half of organizations (54%) report that all of their applications are now containerized. This is driven, in part, by cloud-only organizations (those running all of their applications in one or more public clouds), which account for 24% of ECI respondents; 66% of these organizations report that all of their applications are containerized. However, this is likely reflective of applications built in-house by organizations rather than off-the-shelf enterprise applications.
This shift toward containers is driven by the development of new applications, which are most often built on containers, and is accelerated by AI. For this reason, we can expect the trend toward containerization to continue over the coming years. This is supported by the fact that 98% of organizations surveyed say they are at least in the process of containerizing applications, including both legacy and newly developed applications (Figure 1).
Figure 1: Current State of Application Containerization (response options: all applications are containerized, both legacy and newly developed; just our newly developed applications; just our legacy applications; we are in the process of containerizing our applications; we have no intent to containerize our applications).
For some readers, this finding may raise the question: what is application containerization, and why has it become so pervasive across modern IT environments? In the broadest terms, containerization is an approach to software engineering that involves packaging up code and all of its dependencies so the application installs quickly and reliably in any computing environment. A container can be deployed to a private datacenter, public cloud, or an edge location, regardless of technology platform or vendor.
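For readers who want to see what this looks like in practice, the following is a minimal, hypothetical sketch using the Docker SDK for Python: it packages an application directory (assumed to contain the code, a Dockerfile, and pinned dependencies) into an image and runs it. The directory and image names are made up; the resulting image could run unchanged in a private datacenter, a public cloud, or at the edge.

```python
import docker

client = docker.from_env()  # connect to the local container runtime

# Package the application and all of its dependencies into a portable image.
# "./genai-api" is a hypothetical folder holding the code and a Dockerfile.
image, _build_logs = client.images.build(path="./genai-api", tag="example/genai-api:1.0")

# Run the same artifact anywhere a container runtime exists.
container = client.containers.run(
    "example/genai-api:1.0",
    detach=True,
    ports={"8080/tcp": 8080},  # expose the hypothetical service port
)
print("started container:", container.short_id)
```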
Modern application development methods, particularly for “cloud native” environments, increasingly favor containerization because of several key benefits:
- Containerization allows applications to be abstracted from whatever OS is hosting them, making them portable in the process.
- Containerization helps abstract software from its runtime environment by making it easy to share CPU, memory, storage, and network resources.
- Containers are isolated in their host environment, so they are less vulnerable to being compromised by malicious code.
- Most containers contain fewer software components, run more quickly than traditional alternatives, and use system resources more efficiently.
- Effective management and orchestration make containers more agile and quicker to deploy than traditional software.
But just because most of today’s applications, both legacy and new, are being containerized, doesn’t mean organizations no longer struggle with elements like data management, governance, and security. We asked respondents about the challenges they face regarding application containerization and container management, and the results highlight persistent challenges across several key areas (Figure 2).
Figure 2: Key Challenges Organizations Face Regarding Application Containerization (the share of respondents who feel their current IT infrastructure requires improvement to fully support cloud native applications and containers; who find cloud native and container application development challenging; and who feel they do not have all of the necessary skills needed to support cloud native applications and containers).
Clearly, the shift towards application containerization comes with a host of challenges for IT organizations to address, including developing and splitting an application into a set of services running in containers, as well as orchestrating a bundle of containers. Orchestration solutions can help address many of the data-related challenges associated with container-based management, security, persistence, and performance.
Spotlight: The AI Application Development Boom Will Catalyze a New Wave of Containerization
Over the past two years, the development and adoption of Artificial Intelligence (AI) based solutions became a ubiquitous enterprise goal. However, we're still in the very early stages of AI solution adoption. Many organizations are working to identify the right workloads and use cases, determine best fit, and understand the budget implications associated with new solution development and deployment. Increasingly, organizations are turning to microservices, an architectural approach that structures applications as a collection of small, independent services, to build scalable and adaptable AI solutions.
Containerization is often a key enabler for building, testing, and iterating quickly on these new AI-based solutions and services, especially applications that leverage Generative AI. These applications involve complex dependencies, including accelerated compute and lifecycle-management scaling, that containerization simplifies. Containers provide the lightweight, portable runtime environment required for deploying microservices at scale in the cloud, and for quickly moving those services, if needed, closer to a legacy application or data set that may be siloed in another environment.
In other words, containerization, cloud native applications, and AI solution development have all become closely intertwined. The result: 70% of respondents in this year’s survey say they will containerize their GenAI applications – the highest among all application categories. We predict a continued boom in containerization driven by GenAI-based application development and deployment over the coming years.
What types of applications are containerized by your organization? (Response options: GenAI applications; development/test applications; enterprise-critical applications (non-database); databases.)
Container orchestration involves a set of automated processes by which containers are deployed, networked, scaled, and managed. The main container orchestration platform used today is Kubernetes, an open-source project that serves as the basis for many of today's container orchestration solutions and services. Functionality provided by orchestration platforms like Kubernetes can help address many of the challenges cited above. Our survey results indicate these orchestration solutions are widely deployed today, with 98% of organizations saying they already use some type of Kubernetes environment. Notably, nearly 80% are using more than one Kubernetes environment, with most using two or three different environments (Figure 3).
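As a rough sketch of what orchestration looks like from an operator's point of view, the example below uses the official Kubernetes Python client to read a Deployment and scale it out; Kubernetes then handles scheduling, networking, and health-checking of the new containers. It assumes a cluster reachable through the local kubeconfig and an existing Deployment named "genai-inference" in the "default" namespace, both of which are hypothetical.

```python
from kubernetes import client, config

config.load_kube_config()        # use credentials from ~/.kube/config
apps = client.AppsV1Api()

# Inspect the current desired state of the (hypothetical) Deployment.
dep = apps.read_namespaced_deployment(name="genai-inference", namespace="default")
print("replicas today:", dep.spec.replicas)

# Declare a new desired state; the orchestrator reconciles the cluster to match,
# creating, networking, and health-checking the additional containers.
dep.spec.replicas = 5
apps.patch_namespaced_deployment(name="genai-inference", namespace="default", body=dep)
```

The declarative, desired-state model shown here is the core of how orchestration platforms keep containerized workloads deployed, scaled, and healthy without manual intervention.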
Figure 3: Number of Kubernetes Environments Deployed Within an Organization Today (response options: one; two; three; four; five; six; don't use; don't know).
Complementing these orchestration solutions is the need for continued infrastructure modernization to support container-based application development and deployment. Over 80% of respondents in this year’s ECI survey said their current IT infrastructure requires some level of improvement to support cloud native applications and containers. With infrastructure modernization comes improved data and application mobility, security, compliance, performance, resiliency as well as simplified operations—all critical elements required to support demanding enterprise workloads and the growing complexity associated with orchestrating these workloads across hybrid multicloud environments.
Spotlight: C-level Expectations vs. Reality Regarding Container Deployment and Support
Survey results highlight some key differences in the way C-level respondents perceive aspects of application containerization within their organization versus all other seniority levels. The biggest problem? C-level decision makers may be perceiving a higher level of application containerization within their organization than exists in reality:
What best describes the status of your organization’s use of containers for applications today? (Response options: all applications are containerized, both legacy and newly developed; just our newly developed applications; just our legacy applications; we are in the process of containerizing our applications; we have no intent to containerize our applications. Results shown for C-level respondents vs. other seniority levels.)
A 12-percentage-point gap between C-level respondents and other seniority levels is a big difference. This gap in the understanding of application containerization may stem from perceptions regarding people skills and support. It doesn’t help that C-level respondents are also more confident in their belief that their organizations have all the necessary skills needed to support cloud native apps/containers:
Does your organization have the necessary skills to support cloud native applications/containers? (Share of respondents answering “We have all the necessary skills we need,” shown for C-level respondents vs. other seniority levels.)
C-level decision makers may be overlooking a critical skills gap, one that will need to be filled before their organizations can actually reach the level of application containerization these leaders believe already exists.
Key Findings
1. Application containerization is the new infrastructure standard.
Nearly 90% of organizations report that at least some of their applications are now containerized, and this number is expected to grow with the rapid adoption of new application workloads like GenAI. Simply put, 94% of respondents agree that their organization benefits from adopting cloud native applications/containers. This approach to infrastructure and application development should be considered the gold standard for delivering seamless, secure access to data across hybrid and multicloud environments.
2. GenAI application adoption and implementation continues at a rapid pace.
Over 80% of organizations already have a GenAI strategy in place, and only 2% admit that they have not started planning their GenAI strategy. That said, the level of implementation varies significantly. Most organizations believe GenAI solutions will help improve their levels of productivity, automation, and efficiency. Meanwhile, real-world GenAI use cases gravitate toward customer support and experience solutions today. However, organizations aspire to apply GenAI solutions to cybersecurity and data protection workloads in the near future.
3. GenAI adoption will challenge traditional norms for data security and privacy.
95% of respondents agree that GenAI is changing their organization’s priorities, with security and privacy being a primary concern. Over 90% of organizations say data privacy is a priority for their organization when implementing GenAI solutions. Clearly, organizations understand that security and privacy are critical components of GenAI success. However, a staggering 95% of respondents still believe their organization could be doing more to secure its GenAI models and applications. Security and privacy will remain a major challenge for organizations as they seek to justify the use of emerging, GenAI-based solutions and ensure that they adhere to traditional security norms, as well as new requirements for data governance, privacy, and visibility.
4. Infrastructure modernization is required to support GenAI at enterprise scale.
Running cloud native applications at enterprise scale requires an infrastructure that can support the necessary requirements including security, data integrity and resilience. Emerging GenAI applications are no exception to this rule. Almost all respondents (98%) face challenges when it comes to scaling GenAI workloads from development to production. In fact, the #1 challenge organizations face when scaling GenAI workloads from development into production is integration with existing IT infrastructure. As a result, IT Infrastructure was chosen as the #1 area of investment needed to support GenAI.
5. GenAI solution adoption requires changes to technology AND people.
52% of respondents say their organization needs to invest in IT training to support GenAI. Similarly, 48% of respondents believe their organization needs to hire new IT talent to support GenAI. There is no denying organizations face acute skills shortages and competition for GenAI-related talent. The good news? Many teams will embrace the challenge to adopt AI-related competencies and skills organically, as part of normal work. This year’s survey shows that 53% of respondents believe advancements in GenAI will provide them with an opportunity to become an AI expert.
For the seventh consecutive year, Nutanix commissioned a global research study to learn about the state of global enterprise cloud deployments, application containerization trends, and GenAI application adoption. In the fall of 2024, U.K.-based research firm Vanson Bourne surveyed 1,500 IT, DevOps, and platform engineering decision-makers (self-reported C-level executives and other decision-makers) around the world. The respondent base spanned multiple industries, business sizes, and geographies, including North and South America; Europe, the Middle East, and Africa (EMEA); and the Asia-Pacific-Japan (APJ) region.
The findings of the 7th Annual Enterprise Cloud Index (ECI) reveal key trends and decision-making preferences regarding application containerization, Kubernetes adoption, and GenAI solution implementation. This year’s results also explore some of the key benefits and challenges organizations are beginning to experience when it comes to emerging GenAI workflows, focusing on elements like data security, compliance, and associated requirements for infrastructure modernization.
©2025 Nutanix, Inc. All rights reserved. Nutanix, the Nutanix logo and all Nutanix product and service names mentioned are registered trademarks or trademarks of Nutanix, Inc. in the United States and other countries. All other brand names mentioned are for identification purposes only and may be the trademarks of their respective holder(s).
Certain information contained in this content may link or refer to, or be based on, studies, publications, surveys, and other data obtained from third-party sources as well as our own internal estimates and research. While we believe these third-party studies, publications, surveys, and other data are reliable as of the date of publication, they have not been independently verified unless specifically stated, and we make no representation as to the adequacy, fairness, accuracy, or completeness of any information obtained from a third party. Our decision to publish, link to, or reference third-party data should not be considered an endorsement of any such content.
Key Differences in Regional Results
Trends across geographic regions (North America, EMEA, APJ) tend to track similarly to one another and the global average. However, there are some key regional differences to note, which are highlighted in the thematic summary and tables below:
Key Regional Differences: the state of application containerization. All global regions show similar rates of “all applications are containerized,” in the 52%-55% range. Notable is the relatively high rate of APJ respondents who said they are still in the process of containerizing applications (16%) compared to other regions, an indication that APJ may be lagging slightly behind its EMEA and Americas counterparts when it comes to application containerization.
Figure 14: Current State of Application Containerization (response options: all applications are containerized, both legacy and newly developed; just our newly developed applications; just our legacy applications; we are in the process of containerizing our applications; we have no intent to containerize our applications. Results shown for the Americas, EMEA, and APJ.)
Key Regional Differences: current state of GenAI strategy & implementation. APJ has the highest proportion of respondents saying they have a GenAI strategy in place and are actively implementing it, compared to other global regions. Surprisingly, the Americas is the comparative GenAI laggard, driven by a relatively high rate of regional respondents stating that they have not started planning their GenAI strategy. Digging deeper into the country-level results, this was driven by respondents in the US, 8% of whom chose this answer, while the corresponding figures for Mexico and Brazil were both under 1%.
Figure 15: Current State of Enterprise GenAI Strategy Implementation (response options: we have a strategy in place and are actively implementing it; we have a strategy in place but have not yet started implementation; we are in the early stages of developing our strategy; we have not started planning our strategy yet; we aren't going to develop a strategy. Results shown for the Americas, EMEA, and APJ.)
Key Regional Differences: top business goals and strategies for GenAI. Increasing productivity is the top business goal across all three global regions. Beyond the top result, there are some minor deviations. For example, respondents in the Americas chose increased innovation as their 2nd most common priority, whereas EMEA and APJ chose increased automation and efficiency as their 2nd most common priority.
Figure 16: Top Business Goals and Strategies Supported by GenAI (response options: increasing productivity; increasing automation and efficiency; increasing innovation; customer retention and support; decreasing operational costs; employee onboarding; industry differentiation; GenAI does not/could not support our overarching business goals and strategies. Results shown for the Americas, EMEA, and APJ.)
Key Regional Differences: challenges scaling GenAI workloads from dev to prod. Respondents in the Americas and APJ see IT infrastructure integration as their top challenge, while EMEA respondents believe lack of skills is their #1 challenge. Notable is the low incidence rate of “limited computational resources” in EMEA compared to other regions, which may be an indication of a more favorable supply environment for things like GPU and accelerated compute hardware and services within the region.
Figure 17: Challenges Faced When Scaling GenAI Workloads from Development to Production (response options: integration with existing IT infrastructure; lack of skills needed to deploy and operate AI; cost of ownership and return on investment visibility; regulatory and compliance hurdles; limited computational resources; we have not faced challenges when scaling GenAI workloads. Results shown for the Americas, EMEA, and APJ.)
Key Regional Differences: hardware-specific plans for GenAI. Respondents in the Americas indicate the highest dependency on GenAI-specific hardware. All regions show a similar rate of respondents who are still “investigating or planning to start investigating” use of GenAI-specific hardware. There is no clear regional laggard, indicating hardware strategy for GenAI seems to be an area where organizations are still learning and planning, globally.
Figure 18: Plans to Use GenAI-Specific Hardware (response options: fully dependent on GenAI-specific hardware; using GenAI-specific hardware for critical workloads only; investigating the use of GenAI-specific hardware; investigating the use of GenAI-specific hardware in the next 12 months; investigating the use of GenAI-specific hardware beyond the next 12 months; we have no plans to use GenAI-specific hardware. Results shown for the Americas, EMEA, and APJ.)
Understanding GenAI’s Impact on Data Security Strategies and Related Skillsets
GenAI applications and services have a symbiotic relationship with their underlying datasets, models, and infrastructure. Enterprises are acutely aware of this relationship, and the importance of developing robust data security and infrastructure scalability strategies in tandem with one another in order to effectively support complex GenAI workflows. When asked to rank in order of importance the data-related aspects of GenAI workload implementation, this year’s respondents chose data privacy/security as the #1 most important data-related aspect of GenAI application/workload implementation, followed by performance and scalability.
Figure 12: Most Important Data-Related Aspects of GenAI Application Implementation (response options: privacy and security; performance; scalability; governance; they are of equal importance). Note: showing responses ranked first.
Further highlighting this need for data security, privacy, and scalability when it comes to GenAI implementation is the fact that over 95% of respondents in this year’s ECI survey agreed that data privacy is a priority for their organization when implementing GenAI. Finally, over 90% of respondents agreed that having one platform to run all of their applications across data centers, clouds, and the edge would be valuable to achieve their GenAI initiatives.
95% believe that data privacy is a priority for their organization when implementing GenAI
Despite being an area of critical importance when it comes to GenAI implementation and success, data privacy/security also remains a targeted area for continuous improvements, with 95% of respondents saying their organization could be doing more to secure its GenAI models and applications. This perceived need for organizations to “do more” to secure GenAI models and applications may also be acting as one of the major challenges and potential hindrances associated with expanding utilization of GenAI workloads and applications today:
Figure 13: Challenges to Expanding the Utilization of GenAI Workloads Today (response options: privacy and security concerns of using LLMs with sensitive company data; complexity and lack of expertise to build a GenAI environment from scratch; lack of use cases for utilizing GenAI; all of the above are/could be of equal challenge; none of the above are/could be challenges; other). Note: showing responses ranked first.
In addition to privacy and security concerns, many organizations also ranked complexity and lack of experience building GenAI environments from scratch as a top challenge. This lack of GenAI experience and skillsets is likely a key contributor to the fact that 68% of respondents believe that GenAI adoption is a challenge for their organization.
95% said their organization could be doing more to secure GenAI models and applications
Spotlight: How Do I Know if My Organization Has the Skills Necessary to Implement GenAI Applications and Workloads?
The 2023 Nutanix State of Enterprise AI Report showed that almost all organizations needed more AI skills across a range of related areas. We expect there to be significant competition over a finite pool of support and development resources for solutions like GenAI, data science and analytics, research and development, platform engineering, and prompt engineering. This year, we learned that 48% of respondents believe their organization needs to invest in IT talent hiring to support GenAI.
Diving deeper into the theme of IT and GenAI talent, this year’s report asked specifically about how organizations are competing and hiring to fill talent gaps. Respondents fall into two camps: those still hiring to gain the skills necessary to support GenAI adoption, and those who say they already have all the necessary skills to support GenAI and are not currently hiring.
If you believe your organization has a skills shortage when it comes to AI solution adoption, you aren’t alone. The majority of organizations are still hiring, training, or investing in staff to improve their overall GenAI skills pool, according to this year’s survey results.
However, an acute lack of skills does not necessarily need to be a hindrance. The State of Enterprise AI Report showed that 85% of respondents plan to purchase existing AI models or leverage existing open-source AI models in order to build their AI applications. Only 10% of respondents said they plan to build their own models. In other words, many organizations will simply spend their way around short-term AI skills gaps.
Another important consideration is that not all AI skillsets need to come from outside the organization or via hired talent, which can be a time-consuming process. Many teams will quickly adopt AI-related competencies and skills as part of normal work, adding to the capabilities of the organization organically. This year’s survey shows that 53% of respondents believe advancements in GenAI will provide them with an opportunity to become an AI expert. In other words, many employees are ready to develop the new skills necessary to fully take advantage of these new solutions, and might also be looking for technology solutions that can support them in this effort.
Overall, survey results indicate that GenAI solution adoption and deployment will necessitate a more comprehensive approach to both enterprise data security and the people-skills needed to support it. Respondents indicate a significant amount of work needs to be done to improve the foundational levels of data security/governance required to support GenAI solution implementation and success. For example:
- 95% of respondents believe GenAI is changing priorities for organizations, with security and privacy becoming higher priorities
- 83% of respondents say they are concerned about the use of GenAI and its impact on data security in the broader IT vendor supply chain
- Half of respondents (50%) believe their organization needs to invest more in cybersecurity to support GenAI
- Only 62% of respondents believe they have the necessary skills to support security and ransomware protection
- Over 65% of respondents note that data governance, security and ransomware protection, and data privacy remain challenging areas for their organization
These results should not dissuade organizations from embracing AI-based solutions or GenAI workloads. On the contrary, the data points can be used to help build the business imperative for the additional investments in people, processes, and technology needed to address novel data and security-related challenges associated with GenAI implementation.