How to finetune MBART on a single language?

Hello,
Can anyone please suggest how I can finetune MBART for a specific language?

I found this asian-bart repo: GitHub - hyunwoongko/asian-bart: Asian language bart models (En, Ja, Ko, Zh, ECJK),
where they reduce mBART to a single language by pruning the embedding layer.
I want to do the same.
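
From reading that repo, my rough understanding of the embedding-pruning step is something like the sketch below. This is only a sketch of the general idea, assuming Hindi ("hi_IN") as the target language and a hypothetical monolingual text file corpus.txt; I have not verified it against their code, and the tokenizer side is only hinted at. Is this the right general approach?

```python
import torch
from torch import nn
from transformers import MBartForConditionalGeneration, MBartTokenizerFast

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-cc25")
tokenizer = MBartTokenizerFast.from_pretrained("facebook/mbart-large-cc25", src_lang="hi_IN")

# 1) Collect the token ids that actually occur in a monolingual corpus of the
#    target language, plus all special tokens (which include the language codes).
keep_ids = set(tokenizer.all_special_ids)
with open("corpus.txt", encoding="utf-8") as f:  # hypothetical monolingual corpus
    for line in f:
        keep_ids.update(tokenizer(line.strip())["input_ids"])
keep_ids = sorted(keep_ids)

# 2) Slice the shared embedding matrix down to those rows and install it as the
#    new, much smaller input embedding (padding-idx handling omitted here).
old_embeddings = model.get_input_embeddings()  # nn.Embedding over the full 250k vocab
new_embeddings = nn.Embedding(len(keep_ids), old_embeddings.embedding_dim)
new_embeddings.weight.data = old_embeddings.weight.data[keep_ids].clone()
model.set_input_embeddings(new_embeddings)

# 3) The LM head is tied to the input embeddings, so rebuild and retie it, and
#    shrink the final-logits bias buffer to the new vocabulary size.
model.lm_head = nn.Linear(old_embeddings.embedding_dim, len(keep_ids), bias=False)
model.lm_head.weight = model.get_input_embeddings().weight
model.register_buffer("final_logits_bias", torch.zeros(1, len(keep_ids)))
model.config.vocab_size = len(keep_ids)

# 4) Old token ids then have to be remapped into the new, smaller id space
#    before fine-tuning; asian-bart goes further and rebuilds the SentencePiece
#    tokenizer itself so that encoding directly produces the new ids.
old2new = {old_id: new_id for new_id, old_id in enumerate(keep_ids)}
```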

I am unable to find any good resource on this. Any suggestions?

Thanks and Regards