Profile

Shaping the Future of Enterprise AI with Intellectual Curiosity

David Kanter, co-founder of MLCommons, explains the evolution of AI and the enterprising mindset for putting it to good use.

April 17, 2025

As a teenager, David Kanter roamed the internet learning about computers and video games. Somehow, his relentless pursuits led him to become a go-to guy for some of the world’s biggest companies and IT decision-makers building or investing in advanced technologies that are changing industries and daily life for people around the world.

He’s seen first-hand three decades of inventions and innovations, from computer chip design and manufacturing to the rise of artificial intelligence, coming together to power digital transformation.

“There’s a beauty to all of this coming together,” Kanter told The Forecast.

Kanter is a co-founder of MLCommons and head of MLPerf. MLCommons is a collaborative engineering consortium dedicated to accelerating machine learning innovation and accessibility, best known for developing the MLPerf benchmarks. These benchmarks standardize the measurement of machine learning hardware and software performance across a range of tasks, helping organizations improve the accuracy, safety, speed, and efficiency of the AI technologies they use.

MLPerf’s primary benchmarks, MLPerf Inference and MLPerf Training, are continually evolving, and MLCommons’ focus is expanding to cover areas such as storage, automotive, TinyML, mobile inference, graph neural networks, recommendation systems, large language models, generative AI, energy efficiency, and privacy-preserving machine learning, reflecting the latest advances in the field. MLCommons also supports community development of the AILuminate benchmarks to guide AI risk and reliability strategies.
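
For readers unfamiliar with performance benchmarking, the core idea behind a suite like MLPerf Inference is to run a fixed workload under controlled conditions and report comparable metrics such as throughput and tail latency. The sketch below is not MLPerf’s harness; it is a minimal, hypothetical illustration of that measurement pattern in Python, where run_inference stands in for whatever model call is under test.

```python
import statistics
import time


def run_inference(sample):
    """Hypothetical stand-in for the model call being measured."""
    time.sleep(0.002)  # simulate roughly 2 ms of work per query
    return sample


def benchmark(samples, warmup=10):
    """Return throughput and latency statistics for a fixed set of queries."""
    # Warm up caches, allocators, etc., so results reflect steady state.
    for s in samples[:warmup]:
        run_inference(s)

    latencies = []
    start = time.perf_counter()
    for s in samples:
        t0 = time.perf_counter()
        run_inference(s)
        latencies.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start

    latencies.sort()
    p99 = latencies[int(0.99 * (len(latencies) - 1))]
    return {
        "throughput_qps": len(samples) / elapsed,
        "mean_latency_ms": statistics.mean(latencies) * 1000,
        "p99_latency_ms": p99 * 1000,
    }


if __name__ == "__main__":
    print(benchmark(list(range(500))))
```

Real benchmark suites go further, specifying datasets, accuracy targets, and submission rules so that results from different vendors and systems are directly comparable.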

According to the latest Enterprise Cloud Index (ECI), a survey of 1,500 IT and business decision-makers worldwide, nearly 85% of respondents already had a GenAI deployment strategy in place and nearly all cited difficulties scaling the compute-intensive technology from development to production. This is where MLCommons, an open engineering consortium, helps as an arbiter of AI performance.

AI and ML technologies are new and evolving quickly, so MLPerf performance benchmarks provide standardized measurements that can help enterprise AI systems managers choose the right technologies, best practices and strategies to meet specific needs. AI performance benchmarks can incentivize competition and drive innovation in AI technologies. MLCommons also develops benchmarks for measuring AI safety and reliability.

In an interview with The Forecast, Kanter talked about how the AI/ML environment is evolving, the implications of power-hungry AI applications, and the kind of thinking that will be required to foster AI development in the coming decades.

David Kanter

Intellectual curiosity fueled his journey from teen gamer to chip-industry analyst to AI thought leader. His observations on the past and future of AI should help enterprise IT pros navigate the emerging machine learning landscape.

How it Started: Growing up in Microsoft’s Shadow

Born in 1982, Kanter grew up in a Seattle household where science and technology took center stage. His father was an endocrinologist who had a local clinic. Kanter often stopped by the clinic after school to learn how the computers worked and lend a hand around the office.

“I didn't learn until later in life that most people don't talk about medical cases over the dinner table,” he recalled.

His mom put her training as an educator to work when raising Kanter and his brother and sister became her full-time job. That household helped fuel his love of learning. In middle school and high school, he was scouring computer-review websites for tips on hardware and software selection.

After high school, he studied mathematics, economics and computer science at the University of Chicago. That’s when he started writing for Real World Tech, a site packed with in-depth insights for computer engineers (its latest item was published June 12, 2022).

“I ended up running the website when I moved out to Silicon Valley in 2005,” he said. 

In the years that followed, he dug deeper into the worlds of hardware and software, consulting with tech stalwarts like HP, Intel and Microsoft and up-and-comers like Nvidia.

“Process technology to me is absolutely magical,” he said.

The Changing Scale of Technology Development

Kanter noted that while the ranks of semiconductor manufacturers are getting smaller, the technology challenges facing chip designers — and the entire tech sector — are becoming gigantic. Decades ago, small engineering teams could cause tectonic shifts. 

“There was a day when you could come up with a simple team, focus on one novel idea and spawn an entirely new product or processor or family of chips,” Kanter said.

The teams building next-generation processors are nothing like that today.

“It's not two or three people in a room,” Kanter said. “It's a small team, 50 or 60 people. And if you look at it from start to finish, you might have 500 or a thousand people. And that's not even including the manufacturing side, which has really evolved into a totally different discipline.”

He pointed to another example of increasing engineering complexity: airbag systems within autonomous vehicles. While airbags have been in use for years, a self-driving car’s computer might be smart enough to trigger the airbag before a crash, potentially improving safety. Every step of that process must be engineered and integrated into AI/ML systems.

The technology industry has matured to the point where mapping out inventive proposals on a whiteboard or PowerPoint goes only so far. 

“Good ideas are a dime a dozen, but the execution of bringing those good ideas to life is super hard,” Kanter added.

Responding to Rising Energy Demands

Kanter acknowledged that bigger, stronger computing systems consume a lot more electricity. Data center energy consumption, for instance, is expected to rise from 1% of global demand to 2% in the years to come, he noted.

“It doesn't sound like a whole lot, way less than cars, transportation, etc.,” Kanter said. “But as we're shifting into the AI era, a lot of these systems use so much power that we may have to double the footprint of data centers.”

Competing demands will have to be reconciled. For instance, data centers traditionally were built in areas with ample fiber-optic connectivity.

“In the future, the places in the world that have the most energy may not be the places that have all that fiber connectivity,” Kanter said.  

Even so, Kanter sees reasons for optimism on the energy-consumption front. 

“Some of those older generations of data centers we built in the nineties and two thousands just don't work for AI,” he said. “We need new data centers that can do AI, and that's stressing the whole system.” 

Fortunately, the new data centers will be far more efficient than the older technologies, he added.

Kanter’s colleagues at MLCommons have joined the effort to improve energy efficiency. 

“They discovered a technique that is about 25% more efficient while delivering the same performance, same quality,” he said.
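
Efficiency claims like that one usually boil down to performance per watt: the same workload run on two configurations, with power measured alongside throughput. The snippet below is a hypothetical back-of-the-envelope comparison, not the MLCommons measurement methodology, and the numbers are purely illustrative (chosen so the gain works out to roughly the 25% Kanter mentions).

```python
def perf_per_watt(queries_per_second, avg_power_watts):
    """Queries served per joule of energy consumed."""
    return queries_per_second / avg_power_watts


# Illustrative numbers only: a baseline system vs. a tuned configuration
# that delivers the same throughput at lower average power draw.
baseline = perf_per_watt(queries_per_second=1000, avg_power_watts=800)
tuned = perf_per_watt(queries_per_second=1000, avg_power_watts=640)

improvement = (tuned - baseline) / baseline
print(f"Efficiency gain: {improvement:.0%}")  # 25% with these figures
```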

How Thinking on AI Needs to Change

Decades of observing advanced technologies convinced Kanter of the value of a T-shaped perspective — broad overall and deep in one or two areas.

“I've been lucky enough to be extremely T-shaped: There is a huge number of things I know about, from the process technology we use to manufacture chips and storage all the way up to applications like databases and machine learning,” he said. 

“And I've been privileged enough to go deep in a few areas that have been quite valuable. I think if you go too deep and only learn about one thing, you're going to lose some of that wider view of how the thing you're working on is being used.”

He uses California’s Napa Wine Country to make the point.

“Right now, I'm recording this from Napa, and the truth of the matter is I'll probably never make wine, but it is kind of cool to know what goes into it and what makes the subtle interplay of the soil, the temperature, the insolation, the grape, how those all play together,” he said.

Maybe someday he’ll help develop an agriculture app that uses AI to improve grape yields.

The main thing is to have an appetite for experimentation. 

“You learn so much more when you build something,” Kanter said. 

Decisions that seem inexplicable to an outside observer make perfect sense to a builder. 

“Usually it comes down to time-to-market or some other constraint you have to work around.”

Kanter said he can only imagine what the next 20 or 30 years are going to be like. But he knows one proven way to find out: “Some of the best advice out there is to dive in — get your hands dirty.” 

And keep on being curious.

Tom Mangan is a contributing writer. He is a veteran B2B technology writer and editor, specializing in cloud computing and digital transformation. Contact him on his website or LinkedIn.

Ken Kaplan and Jason Lopez contributed to this story.

© 2025 Nutanix, Inc. All rights reserved. For additional information and important legal disclaimers, please go here.
