Use this topic to ask your questions to Thomas Wolf during his talk: Transfer Learning and the birth of the Transformers library
This is an example of a test question. You can test the like button below.
Hello I have a question!
What are the best ways to do transfer learning on huge models? And what are the best ways to pick metrics for NLP models?
Is Hugging Face gradually moving from open source to commercial? Many parts of the API are now paid, e.g., model training. Please help me understand which parts are open source vs. paid.
Is this presentation document going to be shared? Thanks!
Yes, the slides will be shared after the talk
Is there a systematic way to estimate how many annotated samples will be required to train a custom model (let’s say NER) by fine-tuning an existing model?
What does Thomas use to build/train models? notebooks? vscode? vim? other???
How did the name Hugging Face and the use of the 🤗 emoji come about?
This is answered at 17:15 on the main stream.
When explaining the reasons for the main revolution in 2018, you pointed out two separate points: (1) transfer learning and (2) Transformers for increased efficiency. Does that mean there is no inherent link between transfer learning and Transformers? Are Transformers only/mostly used because they are very efficient?
Will the video/recording also be shared later? (It is becoming late here…)
I believe the same YouTube streaming link will have the video.
The live stream can be viewed at any time on YouTube, and we will also edit the stream to share each talk as a separate YouTube video.
What about using graph neural networks (GNNs) for NLP tasks? Graph structures can capture data with complex structures and relationships, and GNNs provide an opportunity to study and model complex data representations for NLP tasks…
Looking forward to the reply on GNNs.
Interesting! Looking into this reply
Can you tell us about the distributed training efforts taken by the HF community?
The GNN question above is answered at 41:00 of the main stream.