Transformers Huge Community Feedback

Back in February we shared our second feedback request on :hugs: Transformers. We got an amazing 500+ responses, with a lot of constructive and freeform comments to analyze. This is the second edition of the survey; you can find the first analysis here.

It is thanks to all of your answers that we’re able to steer the library - and the whole Hugging Face ecosystem - in a direction that fits you all; so we wanted to thank you for sharing your thoughts and a little of your time to let us know what you think, what you like, and what you dislike. As a community-focused endeavour, it is amazing to see that you all care about the direction in which we’re heading, to see the outpouring of positive and encouraging comments, and the very constructive feedback you are ready to give. From all of us at Hugging Face: Thank you!

Let’s try to summarize and share some takeaways from all of your responses.

:male_detective: Who are you?

We’re still looking at the same three big user communities of roughly equal sizes among respondents:

  • Researchers make up the largest share of respondents, at more than a third of users (Blue)
  • Data scientists are a close second (Red)
  • Machine Learning Engineers (Green)

Alongside this, we asked whether you were more of a beginner in NLP or more of an experienced user:

:watch: For how long?

The distribution is very similar across the different specialties - it is interesting to see that while nearly half of the respondents have been using Transformers for more than a year (Purple + Orange), there remains a significant influx of new users, as ~16% of you have adopted Transformers in the last three months.

:woman_technologist: Work or :artist: fun

We were interested in understanding how our users use Transformers - for work or for fun:

We’re happy to observe that, for all categories, a large majority of the user base uses Transformers for work - more than 85% of respondents in each category - and that more than half use it for fun as well.

:star: Recommending the library

We’re glad to see that you appreciate the direction in which we’re going - and that most of you would recommend the library to your peers. We wanted to gather more precise feedback, relative to the two main frameworks supported by the library - PyTorch and TensorFlow.

:heart: PyTorch side of the library

Here too, we’re satisfied to see that most of you (>88%) would give an 8-10 score to the PyTorch side of the library. Some interesting takeaways we learned from your answers:

  • The implementation is solid, yet some parts are hard to dive into and modify
  • Needs some more detailed documentation for advanced parts of the library
  • There remain some **backward-incompatible changes** across versions

:orange_heart: TensorFlow side of the library

While the general sentiment regarding the TensorFlow side of the library is high, we understand that it is not on par with the rest of the library; we greatly appreciate your feedback, some examples of which are visible below:

  • Keras examples are lacking
  • PyTorch has better support across the board
  • Few examples detailing how to use Transformers and put it into production

:green_book: Documentation

The general sentiment regarding the documentation is good - but some of you shared very interesting feedback, which we’re taking into account. Some examples below:

  • The code and documentation are clear but sometimes lack examples of how to put them into practice
  • Occasionally lacking
  • A few errors and bugs remain across the documentation

:heart_eyes: Likes

What you like the most is:

  • Ease of use of the library
  • Number of models available
  • API

and most notably: the community itself! A lot of you answered that one of the aspects you enjoy the most about Hugging Face is the community built around it. As a community-centered library, we couldn’t be happier to foster this and continue to build amazing stuff with all of you.

:thinking: Dislikes

What you dislike the most is:

  • Oversimplification of examples
  • Lack of backward compatibility
  • Duplicate code

Thank you all for your feedback. We’ve read each and every one of your comments, and we aim to address the pain points you’ve shared with us.

:hugs: Open Feedback

And finally, since we enjoy HuggingFace-shaped word clouds, here’s a final one containing the freeform comments you have shared with us:

We’re happy to see the most noticeable one is still:

Thanks :heart:


I wanted to share that I was so happy to be a part of Wav2Vec2 fine-tuning week!
