Fine-tuning M2M100 & mBART-cc25 for machine translation (one-to-many)


I am working on a translation model from Breton to French (about 250,000 sentence pairs).

This data made it easy to train a strong model in both translation directions, fr->br and br->fr.

However, I would like to leverage this training to also translate Breton into English, German, or Spanish.

Is there a way to fine-tune mBART or M2M100 on a one-to-many or many-to-one task? If so, what is the easiest way?
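In case it helps anyone looking at this later, here is a minimal sketch of the data-preparation side of one-to-many fine-tuning. M2M100 handles multiple target languages by tagging each example with a target-language code, which is later used to set `forced_bos_token_id` at generation time. The file layout, pair counts, and helper name below are illustrative assumptions, not something from my actual setup:

```python
# Sketch: build a mixed one-to-many dataset for M2M100-style fine-tuning.
# Each example keeps its own target-language code so the tokenizer can
# prepend the right language token during training, and so generation can
# set forced_bos_token_id per target language.

def build_multilingual_examples(pairs_by_lang):
    """Flatten {tgt_lang: [(src, tgt), ...]} into tagged examples."""
    examples = []
    for tgt_lang, pairs in pairs_by_lang.items():
        for src_text, tgt_text in pairs:
            examples.append({
                "src_lang": "br",        # Breton source for every pair
                "tgt_lang": tgt_lang,    # per-example target code
                "src_text": src_text,
                "tgt_text": tgt_text,
            })
    return examples

# Toy data standing in for the real br-fr corpus plus extra target languages.
pairs = {
    "fr": [("Demat", "Bonjour")],
    "en": [("Demat", "Hello")],
}
examples = build_multilingual_examples(pairs)
```

With examples shaped like this, you can tokenize each one by setting `tokenizer.src_lang = ex["src_lang"]` and passing `text_target=ex["tgt_text"]`, then generate with `forced_bos_token_id=tokenizer.get_lang_id(ex["tgt_lang"])`. Whether mixing all target languages in one fine-tuning run preserves quality on each pair is exactly the open question here.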

Thank you for your help


Any updates?

The SMALL100 model allows fine-tuning on a specific pair without losing capability on the other languages, but it performs worse than M2M100.