[Nov 15th Event] Thomas Wolf: Transfer Learning and the birth of the Transformers library

Use this topic to ask Thomas Wolf your questions during his talk: Transfer Learning and the birth of the Transformers library

You can watch it on YouTube or on Twitch at 8am PST

Slides

This is an example of a test question. You can test the like button below.

Hello, I have a question!

What are the best ways to do transfer learning on huge models? And what are the best ways to pick metrics for NLP models?
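
While we wait for the talk, here is a minimal sketch of what transfer learning often looks like with the Transformers library: fine-tuning a pretrained checkpoint on a small labeled dataset. The checkpoint, dataset, and hyperparameters below are illustrative assumptions, not recommendations from the talk.

```python
# Minimal transfer-learning sketch with the Transformers library (illustrative).
# Assumed setup: binary text classification, the "distilbert-base-uncased"
# checkpoint, and a small slice of the IMDB dataset so it runs quickly.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

checkpoint = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

imdb = load_dataset("imdb")
train_ds = imdb["train"].shuffle(seed=42).select(range(2000)).map(tokenize, batched=True)
eval_ds = imdb["test"].shuffle(seed=42).select(range(500)).map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned", num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    tokenizer=tokenizer,  # enables dynamic padding of each batch
)
trainer.train()
print(trainer.evaluate())
```

For truly huge models, people often freeze most of the network or update only a small fraction of the weights, but which approach works best is exactly the question above.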

Is Hugging Face gradually moving from open source to commercial? Many parts of the API are now paid, e.g., model training. Please help me understand which portions are open source vs. paid.

Is this presentation document going to be shared? Thanks!

Yes, the slides will be shared after the talk :slight_smile:

Is there a systematic way to estimate how many annotated samples will be required to train a custom model (let’s say NER) by fine-tuning an existing model?
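
There is no exact formula as far as I know, but one common empirical approach is to fit a learning curve: fine-tune on nested subsets of the annotations, measure the score at each size, and extrapolate. The sketch below uses a toy scikit-learn classifier as a stand-in for actual NER fine-tuning; every number in it is illustrative.

```python
# Learning-curve sketch for estimating annotation needs (illustrative only).
# A toy classifier stands in for "fine-tune on n annotated examples, then score".
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

X, y = make_classification(n_samples=4000, n_features=50, random_state=0)
sizes, _, val_scores = learning_curve(
    LogisticRegression(max_iter=1000), X, y,
    train_sizes=[100, 250, 500, 1000, 2000], cv=3, scoring="f1",
)
mean_val = val_scores.mean(axis=1)

# Scores often grow roughly linearly in log(n), so fit that trend and
# extrapolate to the annotation count needed for a target score.
slope, intercept = np.polyfit(np.log(sizes), mean_val, deg=1)
target = 0.95
print("estimated samples for target:", int(np.exp((target - intercept) / slope)))
```

With a real NER model the loop is the same: fine-tune on, say, 100/250/500/1,000 examples, track entity-level F1, and stop annotating when the curve flattens near your target.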

What does Thomas use to build/train models? Notebooks? VS Code? Vim? Other?

How did the name HuggingFace and the use of :hugs: come about?

This is answered at 17:15 on the main stream.

When explaining the reasons for the main revolution in 2018, you pointed out two separate factors: (1) transfer learning and (2) transformers for increased efficiency. Does that mean there is no inherent link between transfer learning and transformers? Are transformers only/mostly used because they are very efficient?

Will the video/recording also be shared later? (It is getting late here…)

The same YouTube link used for streaming will have the video, I believe :slight_smile:

The live stream can be viewed at any time on YouTube, and we will also edit it to share each talk as a separate YouTube video :slight_smile:

What about using graph neural networks (GNNs) in NLP tasks? Graph structures can capture data with complex structures and relationships, and GNNs provide the opportunity to study and model complex data representations for NLP tasks…

Looking into a reply for the GNN question.

Interesting! Looking into this reply.

Can you tell us about the distributed training efforts undertaken by the HF community?
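
Not an official answer, but for anyone curious what distributed training looks like in the HF ecosystem, the Accelerate library wraps a plain PyTorch loop so the same script runs on one GPU, many GPUs, or TPUs. A minimal sketch with a placeholder model and dataset:

```python
# Minimal sketch of the Accelerate training-loop pattern (illustrative:
# the model, data, and hyperparameters are placeholders).
import torch
from accelerate import Accelerator
from torch.utils.data import DataLoader, TensorDataset

accelerator = Accelerator()  # detects the available processes/devices

# Toy regression setup standing in for a real model and dataset.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loader = DataLoader(
    TensorDataset(torch.randn(256, 10), torch.randn(256, 1)),
    batch_size=32, shuffle=True,
)

# prepare() adapts everything to the current distributed configuration.
model, optimizer, loader = accelerator.prepare(model, optimizer, loader)

for inputs, targets in loader:
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(inputs), targets)
    accelerator.backward(loss)  # replaces loss.backward()
    optimizer.step()
```

Launched with `accelerate launch script.py`, the same code runs in a multi-process setup without changes.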

The GNN question above is answered at 41:00 of the main stream.