Is Huggingface gradually becoming commercial rather than open source? Many parts of the API are now paid, e.g., model training. Please help me understand which portions are open source vs. paid.
This is answered at 42:14 of the main stream.
Do you see future in symbolic learning rather than probabilistic approaches?
How did the name HuggingFace, and the use of the emoji, come about?
This is answered at 43:45 of the main stream.
Has Huggingface any plans on doing stuff on drug discovery and biological science?
This is answered at 44:40 of the main stream.
Thanks Sylvain. Hope you're making some inroads over there on using notebooks with tools like nbdev.
I would also be interested in hearing Jay's POV on this, if possible.
Got it, thank you!
I added the link to Thomâs slides on the top post.
GPT models such as GPT-J and GPT-Neo have large parameter counts, which makes inference slower. If we want to reduce inference time, what would be your suggestion?
When doing dynamic truncation for tokenization, where the max input length is determined at the batch level… have you experimented with engineering the length distributions there? For example, if every one of your batches contains one example that is, say, 10x longer than the average input length for that batch, dynamic padding doesn't seem to provide as much benefit.
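To make the concern above concrete, here is a minimal sketch in plain Python (with entirely hypothetical data and a toy padding count, not the actual tokenizer internals) showing why a single long outlier per batch erodes the benefit of per-batch dynamic padding, and how length-grouped batching, i.e. sorting examples by length before forming batches, confines the outliers to one batch:

```python
import random

def pad_batch(batch, pad_id=0):
    """Pad every sequence in the batch to the length of its longest member."""
    max_len = max(len(seq) for seq in batch)
    return [seq + [pad_id] * (max_len - len(seq)) for seq in batch]

def padded_tokens(batches):
    """Total tokens (real + padding) after per-batch dynamic padding."""
    return sum(len(b) * max(len(s) for s in b) for b in batches)

# Hypothetical dataset: mostly short sequences plus two ~10x outliers.
random.seed(0)
seqs = [[1] * random.randint(8, 16) for _ in range(62)]
seqs += [[1] * 120, [1] * 120]  # the outliers
random.shuffle(seqs)

batch_size = 8

# Naive batching in arrival order: each outlier inflates its whole batch to 120.
naive = [seqs[i:i + batch_size] for i in range(0, len(seqs), batch_size)]

# Length-grouped batching: sort by length so both outliers share one batch.
by_len = sorted(seqs, key=len)
grouped = [by_len[i:i + batch_size] for i in range(0, len(by_len), batch_size)]

print("naive padded tokens:  ", padded_tokens(naive))
print("grouped padded tokens:", padded_tokens(grouped))
```

The real-token count is identical in both cases; only the padding overhead changes. This length-grouping idea is what sampler options like grouping by length in training loops aim at, at the cost of less randomness in batch composition.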