As a teenager, David Kanter roamed the internet learning about computers and video games. Somehow, his relentless pursuits led him to become a go-to guy for some of the world’s biggest companies and IT decision makers building or investing in advanced technologies that are changing industries and daily life for people around the world.
He has seen firsthand how three decades of inventions and innovations, from computer chip design and manufacturing to the rise of artificial intelligence, have come together to power digital transformation.
“There’s a beauty to all of this coming together,” Kanter told The Forecast.
Kanter is a co-founder of MLCommons and head of MLPerf. MLCommons is a collaborative engineering consortium dedicated to accelerating machine learning innovation and accessibility, known primarily for developing the MLPerf benchmarks. These benchmarks aim to standardize the measurement of machine learning hardware and software performance across various tasks, helping organizations improve the accuracy, safety, speed and efficiency of the AI technologies they use.
MLPerf’s primary benchmarks, MLPerf Inference and MLPerf Training, are continually evolving, and MLCommons’ focus is expanding to cover areas like storage, automotive, TinyML, mobile inference, graph neural networks, recommendation systems, large language models, generative AI, energy efficiency and privacy-preserving machine learning to reflect the latest advancements in the field. MLCommons also supports community development of the AILuminate benchmarks to guide AI risk and reliability strategies.
According to the latest Enterprise Cloud Index (ECI), a survey of 1,500 IT and business decision-makers worldwide, nearly 85% of respondents already had a GenAI deployment strategy in place, and nearly all cited difficulties scaling the compute-intensive technology from development to production. This is where MLCommons helps, serving as an arbiter of AI performance.
AI and ML technologies are new and evolving quickly, so MLPerf performance benchmarks provide standardized measurements that can help enterprise AI systems managers choose the right technologies, best practices and strategies to meet specific needs. Performance benchmarks can also incentivize competition and drive innovation in AI technologies, and MLCommons develops benchmarks for measuring AI safety and reliability as well.
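To make the idea concrete, here is a minimal sketch, not MLCommons code, of the kind of measurement a performance benchmark standardizes: time an inference function over a fixed set of inputs and report throughput and tail latency. The model_fn parameter, the warmup count and the fake_model stand-in are hypothetical placeholders.

```python
import time
import statistics

def measure_inference(model_fn, samples, warmup=10):
    """Time a single-stream inference loop and summarize latency/throughput.

    model_fn: any callable that takes one sample and returns a prediction
    samples:  an iterable of inputs to feed the model
    """
    samples = list(samples)

    # Warm-up runs so one-time costs (caches, lazy initialization) don't skew results
    for s in samples[:warmup]:
        model_fn(s)

    latencies = []
    start = time.perf_counter()
    for s in samples:
        t0 = time.perf_counter()
        model_fn(s)
        latencies.append(time.perf_counter() - t0)
    total = time.perf_counter() - start

    latencies.sort()
    p90 = latencies[int(0.9 * (len(latencies) - 1))]
    return {
        "throughput_qps": len(samples) / total,
        "mean_latency_ms": statistics.mean(latencies) * 1e3,
        "p90_latency_ms": p90 * 1e3,
    }

if __name__ == "__main__":
    # Stand-in "model": a busy loop; a real harness would call an actual predict() function
    fake_model = lambda x: sum(i * i for i in range(10_000))
    print(measure_inference(fake_model, range(200)))
```

A real MLPerf submission goes much further, pinning down datasets, accuracy targets and query scenarios so that results are comparable across systems, but the core idea is the same: agree on what to measure and how to measure it.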
In an interview with The Forecast, he talked about how the AI/ML environment is evolving, the implications of power-hungry AI applications, and the kind of thinking that will be required to foster AI development in the coming decades.
Intellectual curiosity fueled his journey from teen gamer to chip-industry analyst to AI thought leader. His observations on the past and future of AI should help enterprise IT pros navigate the emerging machine learning landscape.
Born in 1982, Kanter grew up in a Seattle household where science and technology took center stage. His father was an endocrinologist who ran a local clinic. Kanter often stopped by the clinic after school to learn how the computers worked and lend a hand around the office.
“I didn't learn until later in life that most people don't talk about medical cases over the dinner table,” he recalled.
His mom, trained as an educator, made raising Kanter and his brother and sister her full-time job. That household helped fuel his love of learning. By middle school and high school, he was scouring computer-review websites for tips on hardware and software selection.
After high school, he studied mathematics, economics and computer science at the University of Chicago. That’s when he started writing for Real World Tech, a site packed with in-depth insights for computer engineers (its most recent item was published June 12, 2022).
“I ended up running the website when I moved out to Silicon Valley in 2005,” he said.
In the years that followed, he would dig deeper into the worlds of hardware and software, consulting with tech stalwarts like HP, Intel and Microsoft and up-and-comers like Nvidia.
“Process technology to me is absolutely magical,” he said.
Kanter noted that while the ranks of semiconductor manufacturers are getting smaller, the technology challenges facing chip designers — and the entire tech sector — are becoming gigantic. Decades ago, small engineering teams could cause tectonic shifts.
“There was a day when you could come up with a simple team, focus on one novel idea and spawn an entirely new product or processor or family of chips,” Kanter said.
The teams building next-generation processors are nothing like that today.
“It's not two or three people in a room,” Kanter said. “It's a small team, 50 or 60 people. And if you look at it from start to finish, you might have 500 or a thousand people. And that's not even including the manufacturing side, which has really evolved into a totally different discipline.”
He pointed to another example of increasing engineering complexity: airbag systems within autonomous vehicles. While airbags have been in use for years, a self-driving car’s computer might be smart enough to trigger the airbag before a crash, potentially improving safety. Every step of that process must be engineered and integrated into AI/ML systems.
The technology industry has matured to the point where mapping out inventive proposals on a whiteboard or PowerPoint goes only so far.
“Good ideas are a dime a dozen, but the execution of bringing those good ideas to life is super hard,” Kanter added.
Kanter acknowledged that bigger, stronger computing systems consume a lot more electricity. Data center energy consumption, for instance, is expected to rise from 1% of global demand to 2% in the years to come, he noted.
“It doesn't sound like a whole lot, way less than cars, transportation, etc.,” Kanter said. “But as we're shifting into the AI era, a lot of these systems use so much power that we may have to double the footprint of data centers.”
Diverging challenges will have to be reconciled. For instance, data centers traditionally were built in areas with ample fiber-optic connectivity.
“In the future, the places in the world that have the most energy may not be the places that have all that fiber connectivity,” Kanter said.
Even so, Kanter sees reasons for optimism on the energy-consumption front.
“Some of those older generations of data centers we built in the nineties and two thousands just don't work for AI,” he said. “We need new data centers that can do AI, and that's stressing the whole system.”
Fortunately, the new data centers will be far more efficient than the older technologies, he added.
Kanter’s colleagues at MLCommons have joined the effort to improve energy efficiency.
“They discovered a technique that is about 25% more efficient while delivering the same performance, same quality,” he said.
Decades of observing advanced technologies convinced Kanter of the value of a T-shaped perspective — broad overall and deep in one or two areas.
“I've been lucky enough to be extremely T-shaped: There is a huge number of things I know about, from the process technology we use to manufacture chips and storage all the way up to applications like databases and machine learning,” he said.
“And I've been privileged enough to go deep in a few areas that have been quite valuable. I think if you go too deep and only learn about one thing, you're going to lose some of that wider view of how the thing you're working on is being used.”
He uses California’s Napa Wine Country to make the point.
“Right now, I'm recording this from Napa, and the truth of the matter is I'll probably never make wine, but it is kind of cool to know what goes into it and what makes the subtle interplay of the soil, the temperature, the insolation, the grape, how those all play together,” he said.
Maybe someday he’ll help develop an agriculture app that uses AI to improve grape yields.
The main thing is to have an appetite for experimentation.
“You learn so much more when you build something,” Kanter said.
Decisions that seem inexplicable to an outside observer make perfect sense to a builder.
“Usually it comes down to time-to-market or some other constraint you have to work around.”
Kanter said he can only imagine what the next 20 or 30 years are going to be like. But he knows one proven way to find out: “Some of the best advice out there is to dive in — get your hands dirty.”
And keep on being curious.
Tom Mangan is a contributing writer. He is a veteran B2B technology writer and editor, specializing in cloud computing and digital transformation. Contact him on his website or LinkedIn.
Ken Kaplan and Jason Lopez contributed to this story.
© 2025 Nutanix, Inc. All rights reserved. For additional information and important legal disclaimers, please go here.