COSS Community 🌱

Joseph (JJ) Jacks for COSS Community

OCS 2020 Breakout: Clement Delangue

Clement Delangue is the co-founder and CEO of Hugging Face, the leading NLP startup, based in NYC and Paris, which has raised more than $20M from prominent investors. The company created Transformers, the fastest-growing open-source library, enabling thousands of companies to leverage natural language processing. Prior to Hugging Face, Clement started his machine learning journey at Moodstocks, a startup that built machine learning for computer vision and was acquired by Google.

Relevant Links
LinkedIn - Twitter
Clément Delangue is the co-founder and CEO of Hugging Face, the COSS company driving democratization in NLP with open source at the core.

I want to go to the origins of Hugging Face. Where did the project originate? Why did you pick the name? Tell me about the beginning and how all this came to be. - 0:15

You’ve built an open-source community that’s very vibrant. You’re close to 40,000 stars on GitHub. How do you measure the growth and the size of your community? What are the metrics that you pay attention to? - 3:38

When you created the open-source project, was this with the intention to build a commercial platform as well? How would you think about describing the relationship between your open-source community and Hugging Face the company? - 4:45

This is a big theme that we’re seeing at OCS. Often the commercial companies around the open-source technology exist to serve the open-source and grow the open-source and make it more vibrant, in addition to helping customers. How do you think about different personas of people using the open source vs the customer persona? What kinds of things are you building for customers? - 6:38

(JJ elaborates on Clement’s reference to MongoDB). I’m curious about open-source natural language processing adoption with developers. Hugging Face Transformers is built on PyTorch and TensorFlow. You’re integrating these core technologies to make things simpler. What are the biggest points of friction for developers building these kinds of applications, and how does Transformers help simplify things for developers? - 8:58

I wanted to share the screen to show the model architectures in the Transformers repo. You have all these really awesome model architectures. How do people go about deciding what is the right model for them? Do they have to read all the research papers? (Clement recommends…) - 14:17

So this is really kind of a discoverability solution, so you can find the right company and the right model for your application, right? (JJ elaborates on discoverability in open-source). - 15:51

Talk about your company-building process so far. You’re early in your journey, thinking through commercialization approaches, really focused on growing your open-source community. What kinds of hires have you made at the company level so far? You’ve also raised a good amount of money to invest in your community and explore some things. Can you also talk about your fundraising journey, and how you’ve built the company up so far? - 18:24

750 contributors is an incredible amount of external contribution. Can you talk about the contributor graph, and how that’s evolved? You mentioned these “spring weeks.” How do you drive external contributions and motivate people? I’d love for you to elaborate more on this external network of contributors, who are not employed by your company, but have contributed a ton of value to the open-source project. - 20:45

You have a very positive-sum mindset as a founder, and I think that’s a huge key to success, especially in a space that’s seen lots of innovation and disruption driven by open source. I love how your mission is to democratize NLP through open source. That’s really exciting. (Clement responds and elaborates on challenges/observations on positive-sum thinking). - 24:36

(JJ elaborates and agrees with Clement’s comments on value capture. Clement shares best places to contact him.) - 27:52

Share your questions and comments below!