Artificial intelligence (AI) is pushing application development into hyperdrive, according to Dan Ciruli, senior director of product management for cloud native software at Nutanix.
“AI will be in the data center,” Ciruli said in an interview with The Forecast. “It’ll be in the cloud, it’ll be at the edge, it’ll be on your phone, it’ll be on your watch. It’s going to be everywhere.”
“That’s going to mean an order of magnitude more applications.”
That’ll bring more complexity for enterprise IT staff, who’ll need AI-powered automation to ride herd on all those apps.
Ciruli saw this coming more than 20 years ago, when he was getting into blog arguments about the future of servers. While some experts predicted servers would soon go extinct, he argued the opposite: the never-ending thirst for new computing innovations would keep increasing server demand.
“The reason I had such confidence then and now is that success begets success,” he said.
People want more of the benefits of automation, personalization, smartphones and everything else that has infused computing into modern life.
The future vindicated Ciruli’s outlook: Global server sales topped $108 billion last year and could pass $200 billion by the end of this decade, according to Statista.
In a wide-ranging conversation with The Forecast, Ciruli talked about the rise of cloud native development and its ability to dramatically accelerate software timelines. He has decades of IT product management experience, including stints at Google, Zuora and D2iQ, part of which was acquired by Nutanix in late 2023. He explained that while AI and cloud native development are coming together, they aren’t completely converging.
It Starts with Cloud Native Development
For decades, software developers grumbled about inflexible physical servers. Things got better with the advent of virtualization. Weeks or months of waiting for new servers shrank to minutes configuring and spinning up virtual machines. Then came cloud native development, which gave developers seemingly infinite scale and even more speed.
Why did cloud native catch on?
“A bunch of people at a bunch of companies were trying to figure out how to ship software faster,” Ciruli recalled.
After a few years of experimentation, developers’ preferences coalesced around Kubernetes, the open-source platform for deploying and managing cloud native apps. Kubernetes helps developers orchestrate clusters of containers, lightweight isolated environments that host microservices that spin up and shut down as needed.
“With cloud native architectures, you could put together a pipeline where a developer wrote some code, pushed it to a source code repository and then it automagically got deployed to a server somewhere,” Ciruli said.
The server’s physical location — somewhere in a vast cloud data center — wasn’t a big concern. All it had to be was available.
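Ciruli’s “automagic” pipeline typically ends with a step that points a running Kubernetes Deployment at the freshly built container image. Below is a minimal, hypothetical sketch of that final step using the official Kubernetes Python client; the deployment name, namespace and image tag are placeholders rather than anything from the article.

```python
# Hypothetical sketch: the last step of a CI pipeline that "automagically"
# deploys freshly pushed code, assuming the container image has already been
# built and pushed to a registry. Names and tags below are illustrative only.
from kubernetes import client, config

def roll_out(image: str,
             deployment: str = "example-api",   # hypothetical Deployment name
             namespace: str = "default") -> None:
    """Point an existing Deployment at a new container image."""
    config.load_kube_config()   # or load_incluster_config() when run inside the cluster
    apps = client.AppsV1Api()

    # Patch only the container image; Kubernetes rolls out new pods and
    # retires the old ones, so the developer never touches a server.
    patch = {
        "spec": {
            "template": {
                "spec": {
                    "containers": [{"name": deployment, "image": image}]
                }
            }
        }
    }
    apps.patch_namespaced_deployment(name=deployment, namespace=namespace, body=patch)

if __name__ == "__main__":
    roll_out("registry.example.com/example-api:build-1234")  # hypothetical image tag
```

Whether the cluster behind this script sits in a public cloud or an on-prem data center doesn’t change a line of it, which is the point Ciruli makes about location.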
Now that Kubernetes is 10 years old, Ciruli said IT teams everywhere think they need it, even if some aren’t clear why.
“People forget that the whole reason you're adopting it is for more velocity,” he said.
Of course, the speed comes at a price.
“Implementing Kubernetes can be really difficult,” Ciruli added. This is where AI comes in.
How Cloud Native Tech Radically Accelerated App Development
Today’s software-is-everywhere reality got a big push from cloud providers like Amazon, Google and Microsoft, who encouraged developers to embrace Kubernetes. The result, Ciruli said, is that pretty much all cloud native development uses Kubernetes containers. While Kubernetes rose to prominence, on-premises development still relied primarily on virtual machines (VMs).
But it didn’t take long for conventional wisdom to hold that “modern” development happens in containers. In recent years, Ciruli has seen more developers adopting cloud-first methodologies. He said another consensus started emerging as well.
“Companies are rethinking the economics of cloud computing and realizing that, essentially, renting is more expensive than buying, and doing all of your stuff in the cloud is very expensive,” Ciruli said.
“They’re saying, ‘We like this idea of velocity. We can get changes into production faster, but we need to be able to do that on-prem too’.”
The trend is only just getting started, he added.
“Ten years from now, we'll be blown away at where container-based applications are running and what those things are accomplishing,” Ciruli said.
The marriage of Kubernetes and on-prem development hasn’t always been happy. Computing ecosystems became increasingly convoluted and challenging to manage. Indeed, as enterprises push for across-the-board IT modernization, infrastructure software providers like Nutanix have been creating products to help developers simplify their IT architecture.
These speedbumps didn’t deter enterprises from finding value in velocity. Ciruli recalled a discussion with an enterprise development team with a sophisticated VM-based program. They maintained a strong VM development pace with about 300 deployments a year.
“After shifting to container-based development, they went up almost 100x to close to 30,000 deployments per year,” Ciruli said.
These gains illustrate how cloud native development allows deployments and apps to multiply by orders of magnitude even as the workforce and its skills grow at traditional, linear rates.
Emerging Relationship Between AI and Cloud Native Development
Are AI and cloud native technologies converging? Not exactly, according to Ciruli.
“All the big companies are running all of their AI stuff on Kubernetes,” he said. At the same time, learning algorithms are helping those companies untangle the complexities of containerized development.
Companies that blend on-prem and cloud native development will still need some of their data behind the enterprise firewall, Ciruli said. And they won’t want that data finding its way into the large language models (LLMs) driving apps like ChatGPT. AI applications will help IT teams figure all this out.
So, AI and cloud native will dovetail on the backend. Ciruli noted, however, that both are essentially invisible to the front-end user experience.
“Ultimately, when you're consuming an application, you don't care if it's running on cloud native,” he said.
“When I open my bank app and I check my balance, that might be running Kubernetes in the background. Eventually, it might be using AI in the background. I don't necessarily know.”
In consumer applications, users don’t need to know the nuts and bolts of the technology they’re using. Traditionally, developers and IT teams accepted that their jobs were different. Creating software and building infrastructure architectures required hands-on exposure to source code.
Now, generative AI apps can make a lot of extremely complex technology tasks invisible to devs and IT pros.
“We've already got AI chatbots built into our software that allow you to say in plain English, ‘why am I seeing this crash loop error and what can I do about it?’,” Ciruli said.
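For context, a “crash loop” is Kubernetes repeatedly restarting a container that keeps failing, surfaced as a CrashLoopBackOff status. The sketch below is a hypothetical illustration, not the chatbot Ciruli describes: it gathers the same raw signal such an assistant would need, using the official Kubernetes Python client and assuming cluster access via a local kubeconfig.

```python
# Hypothetical sketch: collect every container stuck in CrashLoopBackOff,
# the raw data an AI assistant would reason over when asked "why am I
# seeing this crash loop error?" Assumes a reachable cluster and kubeconfig.
from kubernetes import client, config

def find_crash_loops():
    config.load_kube_config()                 # local kubeconfig assumed
    core = client.CoreV1Api()
    for pod in core.list_pod_for_all_namespaces(watch=False).items:
        for status in (pod.status.container_statuses or []):
            waiting = status.state.waiting
            if waiting and waiting.reason == "CrashLoopBackOff":
                yield (pod.metadata.namespace, pod.metadata.name,
                       status.name, status.restart_count)

if __name__ == "__main__":
    for ns, pod, container, restarts in find_crash_loops():
        print(f"{ns}/{pod} ({container}) restarted {restarts} times")
```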
It used to be that cloud native developers and on-prem developers occupied different worlds. Developers working for the same company might have all their Kubernetes experts in one team while devs on other teams had little or no knowledge of the technology, Ciruli said. Kubernetes was so difficult to master that cloud native experts were hard to find, much less hire.
Ciruli doesn’t expect that to last long.
“AI will help us bridge that skills gap by making the technology easier to use,” he said.
It’s Hybrid Cloud Now
The app explosion is coming because Kubernetes containers can run just about anywhere. Though it’s not easy to deploy an app anywhere you want, Ciruli said, you can do it with the right training and tools. AI-driven automation will make it easier.
And we’ll soon see AI and cloud native technologies disrupt old categories for mobile, edge, on-prem, public cloud, multicloud and any other computing environment.
“In the beginning of cloud computing, we would talk about hybrid as a temporary state,” Ciruli said.
“And now, we as an industry realize hybrid is not a transitional state. It’s how most organizations will be running for the next decade.”