Bracing Data Centers for Wave of AI Workloads

In this video interview, Harmail Chatha, senior director of cloud computing operations at Nutanix, describes the growing challenges of managing data centers as business demands for enterprise AI applications climb.

April 25, 2025

Growth in new data center construction is dramatically spiking upward with the rapid expansion of artificial intelligence projects. According to data from the U.S. Census Bureau, spending on new data center facilities rose at a compounded annual rate of 40% between 2021 and early 2025, reaching an annual rate of approximately $34.8 billion in February 2025.

In response to a wave of enterprise AI innovation, data center managers must meet the increasing power and cooling demands of GPU-based AI technologies.

In this video interview with The Forecast, Harmail Chatha, senior director of cloud operations at Nutanix, explains how the recent wave of AI innovation is creating new challenges and opportunities for IT decision makers. He describes the shift in strategies for managing data centers that must take on a growing number of AI workloads in the foreseeable future.

In the interview, recorded at Nutanix’s 2024 .NEXT event in Barcelona, Chatha explains what’s needed to efficiently operate data centers that use the next generation of CPUs and GPUs. He touches on the importance of measuring and managing emissions at a granular level, from data centers down to individual workloads and components. The ultimate goal is to make intelligent decisions about workload placement to optimize for sustainability and efficiency.

“We're really at the onset of designing our data center of the future,” Chatha said, describing his work for Nutanix. “[Soon] what's legacy is not going to work any longer. It's going to be super inefficient.”

In a previous Tech Barometer podcast, Chatha explained why building data centers is his calling. He helped build and now runs a state-of-the-art hyper-dense data center for Nutanix. His role also includes IT sustainability initiatives, which are chronicled in Nutanix’s annual ESG Report.

Transcript:

Harmail Chatha: You have GPU clouds available like AWS, Google and Azure, but they're kind of the true IaaS and PaaS platforms. Now you actually have clouds that offer you bare metal with GPUs. And what's happening across the industry is there's a lot of companies that are deploying these in traditional data centers and taking up a lot of the power and space as well, because we've always had this concept of hyper-dense racks where we optimize for vertical growth versus horizontal growth. But what's happening within the data centers now is a lot of horizontal growth, because there's not enough power available, not enough infrastructure available to support the power needs of GPU environments. And obviously cooling isn't there as well. So I don't think AI is necessarily pushing the limits within data centers yet, but it's going to very, very soon if data centers don't start to adapt to new technologies, new cooling infrastructure, new power densities that are required as well. So I think we're going to start seeing the limitations within data centers, but I don't believe it's there yet. But as more and more consumption goes in and customers identify workloads that they're going to be running with AI, I definitely see it hitting a limit.

They're at the heart of the power problem in the data centers right now as well, right? These newer generations of CPUs with GPUs are consuming anywhere from 30 to 50% more power within the servers. Hence, the industry is really taxed from a power consumption perspective. Whereas you could deploy a full rack of the older gear, now you can only deploy half a rack. So how do you solve that problem? So now, whereas we historically have been at 17.3 kilowatts per rack, fully maximizing the rack, our new design is going to be 34-plus kilowatts per rack, with liquid cooling to the rack and ultimately getting to the chip as well. So we're really at the onset of designing our data center of the future, also because what's legacy is not going to work any longer. It's going to be super inefficient. There are cooling challenges within the data center. Air cooling is not going to be enough with this new AI technology going in and the consumption of power within the GPUs as well.
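The rack math Chatha describes can be sketched as back-of-the-envelope arithmetic. The 17.3 kW and 34 kW rack budgets and the 30–50% power increase come from the interview; the per-server wattages below are illustrative assumptions, not Nutanix specifications.

```python
# Back-of-the-envelope rack power budgets, using figures from the interview.
# Per-server wattages are illustrative assumptions, not Nutanix specs.

LEGACY_RACK_KW = 17.3   # historical fully loaded rack budget (from interview)
NEW_RACK_KW = 34.0      # new liquid-cooled rack design (from interview)

legacy_server_kw = 0.8                   # assumed draw of an older server
gpu_server_kw = legacy_server_kw * 1.4   # "30 to 50% more power" -> use 40%

def servers_per_rack(rack_kw: float, server_kw: float) -> int:
    """How many servers fit within a rack's power budget."""
    return int(rack_kw // server_kw)

# GPU-era servers fill far fewer slots in a legacy rack, which is why
# operators end up deploying "half a rack" and growing horizontally
# until higher-density, liquid-cooled racks arrive.
print(servers_per_rack(LEGACY_RACK_KW, legacy_server_kw))  # legacy gear, legacy rack
print(servers_per_rack(LEGACY_RACK_KW, gpu_server_kw))     # GPU gear, legacy rack
print(servers_per_rack(NEW_RACK_KW, gpu_server_kw))        # GPU gear, new 34 kW rack
```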

Companies have to start really homing in on, or zooming into, their environments. Not so much holistically at a data center level, but what does a workload look like, and how do you measure the emissions of that workload in itself? Right? And we're just kind of touching the surface on scope one and scope two. How do you really measure scope three, which is the most challenging one? It's basically considered everything else beyond your direct and indirect emissions, with scope three being all-encompassing. How do you get to embodied emissions as well? As an industry, we're not there yet on the embodied emissions of a server. So we're talking about VMs to workloads, but embodied emissions means: what's that single little cable within the system, the server itself, and how do you measure the emissions of that? There are thousands, hundreds of thousands of parts that go into a server. How do suppliers measure the transportation cost and the development cost of those components as well? So really it's all about zooming in right now, right? As we continue to mature in this space, there's a lot of effort that's going to go into measuring, and there are so many companies, new startups, coming out that are starting to just touch the surface of how you measure emissions in itself.

There's a lot of interest from companies wanting to learn and understand sustainability. It's no longer a hypothetical topic or a conversation; you have to roll your sleeves up and take the initiative to first educate yourself on what it is you need to do, understand the different scopes there are, and really start to measure what your footprint looks like. So what I'm seeing is, of course, a lot of interest in the industry. But for those of us, for example, who've been on this journey for the last three-plus years now, or some of the more mature companies that have been measuring their carbon footprint, we've been doing this at a very holistic data center level, at a building level, and then have gone down to a customer's environment level.

So we have data halls, we have cages, so we're able to measure our footprint there. But now, as we announced just earlier today in the opening keynote, within the (Nutanix) Prism Central application we can measure the electrical consumption of a node, and ultimately you can get to a cluster. So we're going a step deeper versus just being holistic at a data center level. This is a good step in the right direction, but where we ultimately need to go, and continue to do more work, is to get to the VM level. Once you can measure the VM, then you've got to get to the workload level, and that's when you're going to be able to make smart and intelligent decisions on what a workload's consumption looks like, correlate that back to the emission factor, and then intelligently move those applications around to more sustainable data centers that might have lower PUEs or more renewable energy as well. So I think that's the journey we're on. I'm glad we're measuring at the node level, but ultimately we've got to get to that VM and application level as well.
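The placement decision Chatha describes, correlating a workload's measured consumption back to an emission factor and a facility's PUE, can be sketched as a simple calculation. All numeric inputs below (power draw, PUE values, grid emission factors) are illustrative assumptions, not measured figures from Nutanix.

```python
# Hedged sketch of the per-workload emissions estimate described in the
# interview: measured power -> energy -> emissions, scaled by facility PUE
# and the local grid's emission factor. All numbers are illustrative.

def workload_emissions_kg(avg_power_kw: float, hours: float,
                          pue: float, grid_kg_co2_per_kwh: float) -> float:
    """Estimate a workload's operational emissions: IT energy drawn by the
    workload, scaled up by the facility's PUE overhead, times the grid's
    carbon intensity."""
    it_energy_kwh = avg_power_kw * hours
    facility_energy_kwh = it_energy_kwh * pue
    return facility_energy_kwh * grid_kg_co2_per_kwh

# Same workload (0.5 kW average draw for one month) placed in two
# hypothetical data centers:
site_a = workload_emissions_kg(0.5, 720, pue=1.6, grid_kg_co2_per_kwh=0.45)
site_b = workload_emissions_kg(0.5, 720, pue=1.2, grid_kg_co2_per_kwh=0.10)

# Site B, with a lower PUE and a cleaner grid, is the more sustainable
# placement -- the kind of decision node/VM-level measurement enables.
print(f"site A: {site_a:.1f} kg CO2e, site B: {site_b:.1f} kg CO2e")
```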

Jason Lopez is executive producer of Tech Barometer, the podcast outlet for The Forecast. He’s the founder of Connected Social Media. Previously, he was executive producer at PodTech and a reporter at NPR.

Ken Kaplan contributed to this video. He is Editor in Chief for The Forecast by Nutanix. Find him on X @kenekaplan.

© 2025 Nutanix, Inc. All rights reserved. For additional information and important legal disclaimers, please go here.