Sometimes, the “next big thing” in technology turns out to be the real deal (think: the cloud). Other times, it merely sizzles and fizzles (think: the metaverse, at least so far). Artificial intelligence (AI) has proven to be the former, with large organizations and everyday users racing to adopt applications that are having a real impact on their output, even as the technology evolves seemingly daily. Most observers agree that AI is poised to revolutionize the way organizations achieve their business goals, but the question is what the future of AI will actually look like.
“This is a fascinating time,” said Induprakas Keri, senior vice president and general manager of hybrid multicloud at Nutanix, in an interview with The Forecast.
“Technology has never been more interesting as AI becomes a mainstream workload inside more IT operations.”
AI is having an immense impact on IT strategies and resources, according to the Nutanix State of Enterprise AI Report. Based on input from over 650 IT, DevOps and platform engineering decision makers, 90% of respondents say AI is a priority for their organization, illustrating the massive and lasting impact of the technology industry wide. But with the future of AI comes a whole host of challenges.
In that same report, 90% of respondents emphasize AI security and reliability as a top concern, and most plan to enhance data protection and data recovery capabilities. The report also showed that AI isn't confined to one location, as 83% of respondents plan to boost their AI edge strategy investment. A major concern is that today’s IT infrastructure wasn’t built for AI, as 91% of respondents agree it's time to modernize.
As AI and machine learning technology begin to open up an impossibly large world of possibilities, IT infrastructure must adjust accordingly.
“AI-based services and applications are absolutely made for hybrid multicloud architectures,” Keri said.
“Steps in the AI workflow will happen across various infrastructure environments, with training happening in the cloud, enrichment, refinement, and training in core data centers, and inferencing at the edge. Successfully delivering a cohesive, scale-out infrastructure that can span across this entire AI workflow will be a key to success.”
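The split Keri describes can be sketched as a simple stage-to-environment routing table. This is a minimal illustration with hypothetical names, not a description of any actual Nutanix component:

```python
# Hypothetical sketch: routing AI workflow stages to infrastructure tiers,
# following the split described above (foundation training in the public
# cloud, refinement in core data centers, inferencing at the edge).

STAGE_TO_TIER = {
    "pretraining": "public_cloud",  # foundation-model training on public data
    "refinement": "data_center",    # augmenting the model with private data
    "inference": "edge",            # serving predictions close to users
}

def place(stage: str) -> str:
    """Return the infrastructure tier where a workflow stage should run."""
    try:
        return STAGE_TO_TIER[stage]
    except KeyError:
        raise ValueError(f"unknown stage: {stage!r}")

print(place("refinement"))  # data_center
```

A real scheduler would weigh data gravity, cost, and compliance rather than a fixed table, but the point stands: each stage has a natural home in the hybrid stack.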
Keri’s nine predictions about the future of AI:
This one doesn’t exactly require a crystal ball. But to truly grasp the magnitude of the coming change, it’s worth thinking beyond abstract concepts like “revolutionary” and “transformational” to really visualize the eventual outcomes of a technology that is constantly improving itself through machine learning.
“It may be obvious that AI is going to transform things,” Keri said.
“What may be less obvious is that I think we’ll actually use AI as the basis for interstellar travel.”
Keri noted that there is a “huge amount” of excitement surrounding AI, and he compared the hype around AI technology to that surrounding Bitcoin several years ago.
“This is what I call ‘the cab ride test,’” Keri said.
“When you ride in a taxi, and the driver asks you what you think of AI, that’s usually a sign that the bubble is about to froth and spill over.”
This doesn’t mean that companies should pull back their investments, though. On the contrary, Keri said, the projected AI trough presents an opportunity for organizations to move beyond the buzz and do the “real work” of building out and testing new applications.
As AI models proliferate and evolve, organizations will need to ensure that they are up-to-date, secure, and functioning optimally, Keri said.
Effective model management is necessary to guarantee that AI systems are reliable and trustworthy and that they can adapt to changes in the field.
According to Keri, AI is the “ultimate” hybrid cloud use case.
“You use public data to create a foundational model, but that’s not going to be enough for your business,” he said.
“You need to refine and augment a foundational model to make it specific to your business, and that can only really be done in your data center, because the moment you do it with infrastructure as a service, you have lost control of your data. And then all of the inferencing happens at the edge.”
A cloud-first approach may soon be a thing of the past, as cloud computing alone is not robust enough to address every business's IT needs in the age of AI. Companies will increasingly turn to hybrid cloud to ensure continued data compliance and security while minimizing latency.
“You can't say that I'm only going to do things on the public cloud or only on the private cloud because typically most organizations are not going to have the compute resources to train foundational models,” said Keri.
With the future of AI, blockchain is poised to make a strong comeback, self-driving cars will become more than just a curiosity, and flying cars may soon become reality, Keri argues.
Blockchain once promised transparency and security by using a public and tamper-resistant ledger stored and verified across a vast network of computers. But its growth was ultimately slowed by the technology’s lack of scalability. Paired with the speed and power of AI, blockchain may once again return with a vengeance, bringing cryptocurrency along with it.
AI and machine learning are the most crucial components of self-driving vehicles: the onboard computer takes in information from a wide variety of cameras and sensors to make split-second decisions, and can even learn to improve its predictions and responses over time. As AI matures, the biggest remaining technological hurdle falls away, leaving little standing between self-driving and flying cars hitting the pavement or soaring through the sky.
Developers should not have to think about the infrastructure, said Keri.
“The developers are thinking about the hybrid cloud app and managing their models,” he said.
“What they want, when they’re doing refinement, is for the right model to show up, and the infrastructure can help with that. If the infrastructure understands what a model is – and what a version of a model is – you can get that model from the public cloud and make it available for refinement, without the developer having to fetch data.”
Graphics Processing Units (GPUs) have reigned supreme in the realm of high-performance computing, which powers AI systems – particularly for tasks that require parallel processing, such as video rendering and deep learning.
However, other technologies are poised to challenge GPUs as researchers advance the use of Tensor Processing Units (TPUs), Field-Programmable Gate Arrays (FPGAs), and even general-purpose central processing units (CPUs).
Keri said, “GPUs won’t be king forever.”
Software will eventually help IT systems choose the most available, efficient processing resources.
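The idea of software picking the right processor can be shown with a toy scheduler. The device names and efficiency scores below are illustrative assumptions, not benchmarks:

```python
# Hypothetical sketch: pick the most efficient available processor for a job,
# per the idea that software will route work across GPUs, TPUs, FPGAs, and
# CPUs rather than assuming a GPU by default.

def pick_processor(pool):
    """Pick the free device with the best efficiency score for this workload."""
    free = [d for d in pool if d["free"]]
    if not free:
        raise RuntimeError("no processors available")
    return max(free, key=lambda d: d["efficiency"])["name"]

pool = [
    {"name": "gpu-0", "free": False, "efficiency": 9.0},  # busy, skipped
    {"name": "tpu-0", "free": True,  "efficiency": 8.5},
    {"name": "cpu-0", "free": True,  "efficiency": 2.0},
]
print(pick_processor(pool))  # tpu-0
```

Even though the GPU scores highest, it is busy, so the scheduler falls through to the next-best available device, which is the behavior Keri anticipates from future infrastructure software.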
Unlike “scale-up” architecture, where the expansion is vertical and involves adding more power to an existing machine (such as more CPUs), scale-out infrastructure is horizontal and involves adding more machines or nodes to a network to increase capacity.
“Because it’s a hybrid cloud app – and because the models have to traverse all the way from the edge, to the data center, to the public cloud – you really need to do this with scale-out infrastructure,” Keri said.
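The scale-up versus scale-out distinction can be reduced to back-of-the-envelope arithmetic. The numbers below are purely illustrative:

```python
# Illustrative sketch of scale-up vs. scale-out capacity growth.
# Scale-up adds power to one machine; scale-out adds machines to the cluster.

def scale_up(cpus_per_node: int, added_cpus: int) -> int:
    """Vertical scaling: one node, more CPUs."""
    return cpus_per_node + added_cpus

def scale_out(cpus_per_node: int, nodes: int) -> int:
    """Horizontal scaling: same node size, more nodes."""
    return cpus_per_node * nodes

print(scale_up(16, 16))  # 32 CPUs in a single, bigger machine
print(scale_out(16, 4))  # 64 CPUs spread across four nodes
```

Scale-out's practical advantage for AI is less about raw totals than about placement: independent nodes can sit in the cloud, the data center, and at the edge while still forming one logical pool.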
Given the intensive demands of AI, this burgeoning technology could pave the way for a new golden age of infrastructure in the IT space. But because of the flexibility and scalability of the cloud, companies can expand their infrastructure carefully and intentionally.
“You want to make sure that with the least effort possible and the least amount of hardware, infrastructure, investments or training, I can gain that insight and then I can scale,” said Keri.
“That means you can start small, and that scalability allows us to say, you don't have to spend a whole lot of effort and money. When you have the right results and when you want to deploy something like that in production, you can easily scale up to hundreds of thousands of nodes.”
Power consumption from inferencing at the edge will become an issue.
“The projected energy demand in the US is expected to double in the next 4 years, to roughly 880 TW,” Keri told IT Cloud News. “This is likely understating the growth if appropriate optimizations are not made to power usage and consumption.
“And contrary to popular perception, inferencing will be the true behemoth of power consumption, not just training.”
This is an updated version of the article first published on November 15, 2023.
Editor’s note: Learn more about the Nutanix Enterprise AI platform, including Nutanix GPT-in-a-Box, a full-stack software-defined AI-ready platform designed to simplify and jump-start your initiatives from edge to core.
Calvin Hennick is a contributing writer. His work appears in BizTech, Engineering Inc., The Boston Globe Magazine and elsewhere. He is also the author of Once More to the Rodeo: A Memoir. Follow him @CalvinHennick.
Ken Kaplan is Editor in Chief for The Forecast by Nutanix. Find him on X @kenekaplan and LinkedIn.
© 2025 Nutanix, Inc. All rights reserved. For additional information and important legal disclaimers, please go here.