Currently, there is only a very limited number of BERT-like models for Hindi on the Hugging Face Hub. For this project, the goal is to create a RoBERTa/BERT model for just the Hindi language.
A randomly initialized RoBERTa/BERT model
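As a minimal sketch of what "randomly initialized" means in practice, the model can be instantiated directly from a fresh config in Flax. The hyperparameters below are roberta-base-like assumptions; in particular, `vocab_size` must match whatever tokenizer is trained on the Hindi corpus:

```python
from transformers import RobertaConfig, FlaxRobertaForMaskedLM

# Assumed roberta-base-like hyperparameters; vocab_size must match the
# tokenizer that gets trained on the Hindi data.
config = RobertaConfig(
    vocab_size=50_265,
    max_position_embeddings=514,
    num_hidden_layers=12,
    num_attention_heads=12,
    type_vocab_size=1,
)

# Instantiating from a config (rather than via from_pretrained) yields a
# model with randomly initialized weights, ready for pretraining from scratch.
model = FlaxRobertaForMaskedLM(config, seed=0)
```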
A masked language modeling script for Flax is available here. It can be used essentially without any code changes.
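For orientation, the core objective that such a script implements looks roughly like the following. This is a schematic sketch of standard BERT-style masking, not the script's actual code; the function name and argument choices are illustrative:

```python
import numpy as np

def mask_tokens(input_ids: np.ndarray, mask_token_id: int, vocab_size: int,
                mlm_probability: float = 0.15, seed: int = 0):
    """BERT-style masking: select ~15% of positions; of those, replace 80%
    with the mask token, 10% with a random token, and keep 10% unchanged."""
    rng = np.random.default_rng(seed)
    input_ids = input_ids.copy()
    labels = input_ids.copy()

    selected = rng.random(input_ids.shape) < mlm_probability
    labels[~selected] = -100  # loss is only computed on the selected positions

    # 80% of selected positions -> mask token
    replaced = selected & (rng.random(input_ids.shape) < 0.8)
    input_ids[replaced] = mask_token_id

    # 10% of selected positions -> random token (half of the remaining 20%)
    randomized = selected & ~replaced & (rng.random(input_ids.shape) < 0.5)
    random_ids = rng.integers(0, vocab_size, size=input_ids.shape)
    input_ids[randomized] = random_ids[randomized]

    # the remaining 10% keep their original token
    return input_ids, labels
```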
The desired project output is a strong RoBERTa/BERT model in Hindi.
The OSCAR dataset might be too small (it has < 10 GB of data for Hindi). It might also be important to find Hindi datasets on which the BERT-like model can be evaluated after pretraining. Once a dataset to fine-tune the pretrained BERT-like model on has been found, one can make use of the text-classification script here.
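To check how much Hindi data OSCAR actually contains, its Hindi portion can be inspected directly with the datasets library. A small sketch, assuming the deduplicated Hindi configuration of OSCAR on the Hub (`unshuffled_deduplicated_hi`):

```python
from datasets import load_dataset

# "unshuffled_deduplicated_hi" is OSCAR's deduplicated Hindi configuration.
oscar_hi = load_dataset("oscar", "unshuffled_deduplicated_hi", split="train")

print(oscar_hi)                   # number of documents in the Hindi split
print(oscar_hi[0]["text"][:200])  # a sample of the raw Hindi text
```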
The most important read would be the following Colab: