Hope you are all doing well.
I am building an English–Marathi translator. I fine-tuned different pre-trained models (AI4Bharat's IndicBERT and Facebook's mBART-50) on my English–Marathi dataset, which has 3.5 million rows.
But the lowest loss I achieved was 1.2, and I want to bring it down further.
Could anyone please find the time to suggest some ways to improve my model's loss?
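For context on what that number means: if the 1.2 is the standard token-level cross-entropy loss, it maps to perplexity via exp(loss). A tiny sketch (the 1.2 figure is from my training run; the conversion itself is just math):

```python
import math

def perplexity(ce_loss: float) -> float:
    """Convert mean token-level cross-entropy loss to perplexity."""
    return math.exp(ce_loss)

print(round(perplexity(1.2), 2))  # about 3.32 candidate tokens per position
```

So a loss of 1.2 means the model is effectively choosing between roughly 3.3 tokens at each step.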
I also tried adding some custom layers (LSTM, Conv1d, Linear) on top of the pre-trained IndicBERT body, since the model is small, but did not achieve good results.
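For reference, the custom head I experimented with had roughly this shape (this is a simplified sketch, not my exact code; the hidden size and vocab size are illustrative placeholders):

```python
import torch
import torch.nn as nn

class CustomHead(nn.Module):
    """Extra layers stacked on the encoder's hidden states:
    Conv1d -> LSTM -> Linear projection to the vocabulary."""

    def __init__(self, hidden: int = 768, vocab: int = 64000):
        super().__init__()
        self.conv = nn.Conv1d(hidden, hidden, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden) from the pre-trained encoder
        x = self.conv(x.transpose(1, 2)).transpose(1, 2)  # Conv1d wants (B, C, T)
        x, _ = self.lstm(x)
        return self.out(x)  # (batch, seq_len, vocab) logits

head = CustomHead()
logits = head(torch.randn(2, 16, 768))
print(logits.shape)  # torch.Size([2, 16, 64000])
```

The head trains from scratch while the encoder starts pre-trained, which may be part of why results were poor.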
I can also provide the GitHub repo link if anyone wants to have a look at my code.
Any input would be highly appreciated.
Thank You in advance.