Profile

Finding Open Source AI Models That Fit Business Needs

Taylor Linton of Hugging Face explains how open source innovation is helping IT teams access tools and models to build AI applications faster.

February 5, 2025

He dropped into Hugging Face in medias res. That was 2021, and Taylor Linton saw the company amassing open-source tools and resources for artificial intelligence and machine learning application developers. There were 15,000 models available to the Hugging Face community when Linton joined. Inventory expanded rapidly to nearly 1 million by the end of 2024. 

In the middle of it all, Linton, who is the founding account executive, supports enterprise sales and partnerships for Hugging Face. 

“We get involved with folks to help them take advantage of open source AI,” he told The Forecast in a video interview.

“Our engineers can sit down with customers and look at what they’re trying to build and what type of constraints they’re working with. We walk through the different trade-offs that you make when picking a different model.”

“Once they grab that model and bring it into their environment, we have maybe 20 different libraries or tools that they use to actually go all the way from building to deploying these models.”

The Open-Source Ecosystem for AI

Hugging Face describes itself as an ecosystem that “helps the AI and machine learning community collaborate on models, datasets, and applications.” But it didn’t start out that way. 

The company launched in 2016 as a conversational chatbot for teens — hence the jazz hands emoji that it uses for its logo, Linton explained. Reaction to the app was tepid at best, but Hugging Face struck gold when it started sharing bits and pieces of the app’s underlying natural language processing (NLP) code online.

Hugging Face was in the right place, at the right time, and with the right stuff. AI and ML had been stirring up interest in boardrooms, and businesses wanted an easy way in. Hugging Face offered open-source models on which companies could piggyback and build their way to a slice of the AI pie.

There’s plenty of established precedent for successful open source in tech. Linux is an open-source operating system that grew into a giant. And GitHub has revolutionized how developers share and use open-source code. Hugging Face wants to do the same for AI. 

Choosing an Open-Source Model

To implement AI, companies need data models. And to implement generative AI, they need large language models (LLMs), which can be painful to build from scratch.

Hugging Face offers a shortcut: Instead of reinventing the wheel every time, companies can start with one of its open-source ML models and fine-tune it using proprietary data. 

“Not having to start from scratch saves time and computing resources,” Linton said. 

However, choosing the right model is no easy task, especially as new models are released every day.

“It’s nearly impossible to automate the process of knowing which model to use and the best way to train a model for a use case,” he said.

Nevertheless, Hugging Face engineers help customers navigate the tradeoffs between latency and accuracy, among many other parameters. 
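The trade-off reasoning Linton describes can be illustrated with a minimal sketch: given a latency budget, pick the most accurate model that fits. The model names and numbers below are invented for illustration only; they are not Hugging Face benchmarks.

```python
# Illustrative sketch of latency-vs-accuracy model selection.
# All model names and figures are hypothetical, not real benchmarks.

candidates = [
    {"name": "small-llm-1b",  "latency_ms": 40,  "accuracy": 0.78},
    {"name": "mid-llm-7b",    "latency_ms": 120, "accuracy": 0.86},
    {"name": "large-llm-70b", "latency_ms": 900, "accuracy": 0.92},
]

def pick_model(candidates, max_latency_ms):
    """Return the most accurate candidate that meets the latency budget."""
    feasible = [c for c in candidates if c["latency_ms"] <= max_latency_ms]
    if not feasible:
        return None  # no model satisfies the constraint
    return max(feasible, key=lambda c: c["accuracy"])

best = pick_model(candidates, max_latency_ms=200)
print(best["name"])  # mid-llm-7b: best accuracy within a 200 ms budget
```

In practice the decision involves many more parameters, such as context length, licensing, and hardware fit, but the shape of the problem is the same: constrain first, then optimize.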

Companies can also decide where their models will run — on a cloud provider, on-prem, or in a Nutanix environment, explained Linton, who said Hugging Face engineers can also help clients run optimized models on the edge. 

Another convenience that Hugging Face offers: Its LLM library integrates with Nutanix GPT-in-a-Box, an off-the-shelf generative AI solution for enterprises. This allows Nutanix’s and Hugging Face’s shared customers to use Nutanix Enterprise AI infrastructure software to easily consume and execute validated LLMs from Hugging Face.

Despite the appeal, not all companies need to start with an open-source model, especially if they’re testing a proof of concept (PoC).

Cost is also an important consideration: it’s relatively affordable to run smaller models in the cloud, while only large models are expensive. For that reason, Hugging Face often recommends customers test-drive a smaller proprietary model in-house to gauge accuracy and performance. 
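Why small models are cheap and large ones are not comes down largely to memory. A common rule of thumb, shown below, is that model weights alone need roughly (parameter count) x (bytes per parameter), with fp16/bf16 precision using 2 bytes per parameter; the sizes chosen are illustrative, and the estimate is a lower bound since activations and the KV cache add more.

```python
# Back-of-envelope GPU memory estimate for serving model weights.
# Rule of thumb: weights need params x bytes-per-param; fp16/bf16 = 2 bytes.
# This is a lower bound: activations and the KV cache add more memory.

def weight_memory_gb(params_billions, bytes_per_param=2):
    """Approximate memory (GB) needed just to hold the model weights."""
    return params_billions * 1e9 * bytes_per_param / 1e9

for size in (1, 7, 70):
    print(f"{size}B params @ fp16 ~ {weight_memory_gb(size):.0f} GB of weights")
```

A 1B-parameter model fits comfortably on a single commodity GPU, while a 70B-parameter model needs on the order of 140 GB for its weights alone, which is why serving cost climbs sharply with model size.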

“Once that PoC has been built, that’s where we can work with them to build it with an open-source approach,” Linton said.

The Advantages of Open Source

Starting with an open-source ML model also helps with data transparency, a critical factor in responsible AI, because building models in public keeps everyone honest. 

The technique of layering in-house data on top of existing models is the best of both open-source and proprietary worlds. Enterprises get the scaffolding of a model that’s already in place; all they have to do is spackle on the walls and make it their own.

Such an approach has both tactical and strategic advantages, Linton said. Because the combined open-source-plus-proprietary model can run in any environment an enterprise chooses, it bypasses latency issues that plague open-source-only deployments. Companies can also fine-tune a number of parameters, including choosing the right hardware for the job, giving them more tactical control.

Adding in-house data also improves accuracy, which is especially important in specialties like medicine or finance. 

“The open-source-first approach really gives folks the opportunity to then train the model with highly domain-specific data so models perform better,” Linton said.

Fine-tuning open-source models and controlling where they run also gives companies strategic control over data, which is precious currency. 

“Data is what can be a competitive differentiator for a company. And so if they train models with the right data and fine-tune it the right way, they can really differentiate themselves from the competition,” explained Linton, who said he is grateful to be learning from “the brightest machine learning engineers in the space.”

“I’m very fortunate that I get to crawl through our Slack channel every day where people are posting about different optimization techniques. And then I can go off on my own and look up particular topics to make sure I’m up to speed on everything,” he continued.

Linton especially appreciates the open-source AI community that Hugging Face has built. 

“It’s fun and challenging to be a part of the field,” he said. 

“Everyone’s trying to figure this out together. And that’s partly why the community is so strong, because no one has all the answers. But if people are building and releasing models in public, we’re all able to learn from each other. It’s cool to see how fast this open-source ecosystem has evolved.”

Editor’s note: Learn about the Nutanix AI Partner Program and explore the Nutanix Enterprise AI capabilities in this blog post and video coverage of its release in November 2024.

Poornima Apte is a trained engineer turned technology writer. Her specialties include engineering, AI, IoT, automation, robotics, climate tech, and cybersecurity. Poornima's original reporting on Indian Americans moving to India in the wake of the country's economic boom won her an award from the South Asian Journalists’ Association. Poornima is a proud member of the Cloud (the sky, not the tech kind) Appreciation Society. Find her at wordcumulus.com.

Jason Lopez and Ken Kaplan contributed to this story.

© 2025 Nutanix, Inc. All rights reserved. For additional information and important legal disclaimers, please go here.
