Pivot Past the Enterprise AI and Cloud Native Hype

From managing infrastructure to grappling with the rising energy demands of AI applications, NAND Research Chief Analyst Steve McDowell covers the key issues IT leaders face.

By Tom Mangan

August 15, 2024

The shiny-objects phase of AI is moving to the rear-view mirror for enterprise IT leaders. 

“I think we're all getting AI fatigue,” Steve McDowell, chief analyst at NAND Research, told The Forecast in a wide-ranging interview recorded at the 2024 .NEXT conference in Barcelona, Spain.

He said now it’s time to dive into the next phases: managing cloud native and other development environments, finding practical and effective AI use cases, and dealing with the deep implications of learning automation.

So many questions swirl around this topic: 

  • How should IT leaders adapt to combined cloud native and conventional environments? 

  • What about developing practical AI use cases that boost business outcomes? 

  • How will IT teams grapple with AI’s energy requirements and ethical impacts?

McDowell has been thinking these issues through and zeroed in on three challenges IT leaders can’t afford to ignore:

  • Unifying infrastructure management

  • Driving value with artificial intelligence

  • Embracing sustainability and ethics

Unifying Infrastructure Management

AI hype has dovetailed with enthusiasm over cloud native environments like Kubernetes. After all, containerization and microservices support a long roster of data-intensive AI functions. Cloud native development also allows fast, secure, scalable deployments.

Nevertheless, many enterprise IT workloads run fine in conventional architectures—and will continue to do so over the next few years. 

“There's a million kinds of legacy workloads,” McDowell said. “It doesn't make sense to go rearchitect them.”

Adding cloud native technology in the enterprise ramps up the complexity for IT teams. Moreover, McDowell added, a solution like Kubernetes or OpenShift needs a bridge to hybrid, multicloud environments.

The solution lies in a unified management platform for hybrid multicloud. 

“They learn one tool for however they're going to deploy,” he said. “It's really about simplicity — about managing from a single pane of glass.”


Unifying the management of hybrid multicloud environments gives enterprises a foundation for tapping the business value of AI. 

Driving Value with Enterprise AI

McDowell sees growing impatience with AI hype.

“Every product briefing I go to, if it doesn't have the word ‘AI’ in it, I'm in the wrong room. It's all anybody wants, whether it's relevant or not.”

But now’s the time to pivot the conversation on AI. 

“Let's stop talking about GPUs,” he said. “Let's stop talking about who the providers are. Let's start talking about, ‘what are we going to do with it and how is it going to change my business?’”


AI can help IT pros understand their infrastructure and analyze their log files. It can guide enterprises toward reinventing their customer experience. 

“Once you start talking about what the business use is, and the value you're trying to extract from any technology, that then is going to drive the technology decisions,” he added.  

Historically, creating AI software required massive investments in compute capacity and programming skill. That’s changing, with vendors offering economical AI apps like ChatGPT. 

“It doesn't have to be expensive to start AI,” McDowell said.  

Generative AI, or GenAI, which uses large language models (LLMs) to create text, images and other content, has been the principal firehose of AI hype. For now, IT pros are finding ways to make GenAI safe in enterprise environments. The next few years will be about deploying it at enterprise scale. That means partnering with a big provider like OpenAI or Anthropic, McDowell said, and fine-tuning LLMs on corporate data.
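As a rough illustration of that fine-tuning step, the sketch below uses the OpenAI Python SDK to upload a training file of corporate examples and start a fine-tuning job. The file name, base model and training data are hypothetical placeholders, not a prescribed workflow.

    # Minimal sketch: fine-tuning a hosted LLM on corporate data with the OpenAI Python SDK.
    # The file name and base model below are hypothetical placeholders.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # 1. Upload a JSONL file of chat-formatted examples drawn from corporate data.
    training_file = client.files.create(
        file=open("corporate_examples.jsonl", "rb"),
        purpose="fine-tune",
    )

    # 2. Kick off a fine-tuning job against a base model.
    job = client.fine_tuning.jobs.create(
        training_file=training_file.id,
        model="gpt-4o-mini-2024-07-18",
    )

    # 3. The job runs asynchronously; once it finishes, the tuned model ID can be
    #    used in place of the base model for chat completions.
    print(job.id, job.status)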

GenAI uses GPUs and databases that enterprise IT pros haven’t seen before. 

“Which is why we're seeing a trend right now of packaging the pieces and saying, ‘here's how you do it, Mr. IT Guy’,” McDowell said. 


Still, questions abound because of uncertainties about deploying GenAI in complex hybrid environments. And then there are the practical business questions.

“How am I going to use AI to enable the next generation or the next iteration of digital transformation? It's going to change all our lives,” McDowell said. “But it's also, how do I use it to make my own IT operations more efficient?”

AI inference at the edge. AI has two primary computing phases: training the models and inference, which applies a trained model to new data to make predictions. While training AI models typically requires vast computing resources, inference is a much lighter computational lift. McDowell suggests this creates an opportunity to exploit inference capability at the network edge, in mobile devices and on remote servers.

“As a business user, the value is in the inference,” he said. “The closer to the data that the AI is, the faster I'm going to get time to value.” 

Cameras, for instance, can collect real-time image data in cars or factory production lines. However, the real-time value is lost in the time it takes to upload image data to a remote AI engine, infer the data’s meaning, and then deliver practical intelligence to the user.   
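To make the contrast concrete, here is a minimal sketch of running that kind of image inference locally with ONNX Runtime instead of uploading each frame to a remote AI engine. The model file, input shape and camera frame are hypothetical stand-ins.

    # Minimal sketch: image inference at the edge with ONNX Runtime.
    # "defect_detector.onnx", the 3x224x224 input shape and the dummy frame are hypothetical.
    import numpy as np
    import onnxruntime as ort

    # Load a model that was trained elsewhere and exported to ONNX.
    session = ort.InferenceSession("defect_detector.onnx")
    input_name = session.get_inputs()[0].name

    def classify_frame(frame: np.ndarray) -> np.ndarray:
        """Score a single camera frame on the device, avoiding the round trip to a remote engine."""
        batch = frame.astype(np.float32)[np.newaxis, ...]  # add a batch dimension
        outputs = session.run(None, {input_name: batch})
        return outputs[0]

    # Stand-in for a real capture from a car or production-line camera.
    dummy_frame = np.random.rand(3, 224, 224)
    print(classify_frame(dummy_frame))

Because only the inference results need to leave the device, the time to value McDowell describes is bounded by local compute rather than by network round trips.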

Of course, IT teams must orchestrate these interactions, manage intricate architectures and use AI to automate and simplify everything. “It's not that different from how I'm doing hybrid multicloud,” McDowell said. “The boxes are just a lot smaller.” 

Embracing Sustainability and Ethics

Discussions of AI in corporate settings must address the data center’s impact on energy usage. 

“It's consuming up to 25% of the energy spent in an enterprise,” McDowell said. He expects these demands to rise further as enterprise AI enters the mainstream.


Moving inference to the network edge could presumably reduce data center energy demands. But that won’t address the full breadth of AI-related sustainability issues. He pointed to a recent report stating that IT consumes around 3% of the world's power.

“And it's growing exponentially,” he said.

A server packed with GPUs might require almost twice as much electricity as a server built a few years ago, McDowell said. And the GPUs produce much more heat, so servers must be physically spaced apart, requiring more room in the data center. Some data centers cannot be built at all because there’s not enough electrical infrastructure to power them, he added.

“Now we're at the point where if I don't make these changes that support sustainability and reduce my power consumption and my air conditioning bill, it impacts my business,” McDowell said. 

Innovations in cooling technology could open up a new generation of processors that enable even more possibilities. But that doesn’t change the pressures on the current power grid.  

“We're running out of power,” he added. “And it's a lot faster and a lot more practical to shrink my electrical footprint than it is to go build a new power plant.”

Ethics and Responsible AI

McDowell said enterprises must answer probing questions about the human impacts of learning automation. 

“How biased is my data? Where do these models come from? What were they trained on? It's a hard problem to solve,” he added. “And, honestly, there's only a handful of companies trying to solve it.”


He said it’s one thing to use an LLM to help write and edit documents. 

“It's a whole other thing when I start using the output of an AI model to drive business decisions or engagements with users,” McDowell cautioned.

The only way to unmask these issues is to put the technology to work, monitor the results and keep improving. 

“Every transformational technology follows this path, although the timelines are getting much shorter,” McDowell concluded. 

“It's understand it. It's enable it. It's play with it — and then it's going to drive change.”

Tom Mangan is a contributing writer. He is a veteran B2B technology writer and editor, specializing in cloud computing and digital transformation. Contact him on his website or LinkedIn.

