Lux Capital is leading the round, with Sequoia and Coatue investing in the company for the first time. Some of the startup’s existing investors participated once again. These investors include Addition, Betaworks, AIX Ventures, Cygni Capital, Kevin Durant and Olivier Pomel.
Despite its short life, Hugging Face has had an interesting evolution. When I first covered the company in 2017, the startup was focused on a consumer app. It looked like yet another messaging app, except that there was no one on the other side of the conversation: it was a chatbot app for bored teenagers.
That consumer bet hasn’t paid off, but the company kept iterating on its natural language processing technology. Hugging Face released the Transformers library on GitHub and instantly attracted a ton of attention — it currently has 62,000 stars and 14,000 forks on the platform.
With Transformers, you can load popular pre-trained NLP models, such as BERT, GPT-2, T5 or DistilBERT, and use them to manipulate text in one way or another. For instance, you can classify text, extract information, automatically answer questions, summarize text, generate text, etc.
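To give a sense of how lightweight this is in practice, here is a minimal sketch using the library's `pipeline` API. It assumes `transformers` (and a backend such as PyTorch) is installed; the first call downloads the task's default model weights, and the example sentence is my own.

```python
# Minimal sketch of the Transformers pipeline API (assumes the
# `transformers` package is installed; weights download on first use).
from transformers import pipeline

# Task names the pipeline API uses for the capabilities mentioned above.
TASKS = ["text-classification", "question-answering", "summarization", "text-generation"]

if __name__ == "__main__":
    # Classify the sentiment of a sentence with the task's default model
    # (DistilBERT fine-tuned on SST-2, at the time of writing).
    classifier = pipeline("text-classification")
    print(classifier("Hugging Face raised a new funding round."))
```

Each task name maps to a sensible default model, but you can pass `model="..."` to swap in any compatible checkpoint from the Hub.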
Due to the success of this library, Hugging Face quickly became the main repository for all things related to machine learning models — not just natural language processing. On the company’s website, you can browse thousands of pre-trained machine-learning models, participate in the developer community with your own model, download datasets and more.
Essentially, Hugging Face is building the GitHub of machine learning. It’s a community-driven platform with a ton of repositories. Developers can create, discover and collaborate on ML models, datasets and ML apps.
Hugging Face also offers hosted services, such as the Inference API that lets you use thousands of models via a programming interface, and the ability to “AutoTrain” your model.
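The Inference API boils down to an authenticated HTTP POST against a per-model endpoint. Here is a hedged sketch using only the standard library; the endpoint pattern and `Authorization` header follow the API's documented shape, while the model name, input sentence and `HF_API_TOKEN` environment variable are illustrative placeholders.

```python
# Sketch of a Hugging Face Inference API call with the standard library.
# Model name, sample text and the HF_API_TOKEN variable are placeholders.
import json
import os
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased-finetuned-sst-2-english"

def build_request(text: str, token: str) -> urllib.request.Request:
    """Build the POST request; actually sending it requires a valid API token."""
    payload = json.dumps({"inputs": text}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )

if __name__ == "__main__":
    token = os.environ.get("HF_API_TOKEN", "")
    req = build_request("Hugging Face raised a new funding round.", token)
    if token:  # only hit the network when a real token is configured
        with urllib.request.urlopen(req) as resp:
            print(json.load(resp))
```

The appeal of the hosted route is that none of the model weights touch your infrastructure: the same request shape works for any model on the Hub.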
With today’s funding round, the company plans to do more of the same; 10,000 companies are now using Hugging Face in one way or another, so it’s not time to pivot again.