Hi miguelvictor and patrickvonplaten, thank you for releasing the multilingual checkpoint of the GPT-2 model.
Link: miguelvictor/multilingual-gpt2-large · Hugging Face
It is extremely useful. For practical use cases, could you please provide more details about the model? I have two questions:
(1) How many languages does this model support, and which languages are they?
(2) How does this model perform on downstream tasks in these languages? Is there any analysis, paper, or report available?
Looking forward to your responses. Thank you!