Size mismatch for lm_head.weight/bias when loading state_dict for Wav2Vec2ForCTC on MMS French pipeline


Facebook’s MMS was recently added to Transformers. I’m trying to reproduce the sample notebook as a pipeline for a non-default language (French).

My notebook is here.


pipe = pipeline(model="facebook/mms-1b-l1107", model_kwargs={"target_lang":"fra"})

It fails with a size-mismatch error. Ignoring the mismatch appears to leave the English adapter loaded, since the transcription quality is poor and doesn’t match the demo on the official Space.

It looks like this line inside the transformers library is supposed to load the target_lang I passed to the pipeline.
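For reference, here is the variant I would expect to work based on how `from_pretrained` handles shape mismatches — this is a sketch, and I’m assuming `ignore_mismatched_sizes=True` is the right flag to let the French `lm_head` be resized instead of raising:

```python
from transformers import pipeline

# Assumption: ignore_mismatched_sizes lets from_pretrained skip the
# strict shape check on lm_head (whose size depends on the target
# language's vocabulary), and target_lang then selects the adapter.
model_kwargs = {"target_lang": "fra", "ignore_mismatched_sizes": True}

pipe = pipeline(
    task="automatic-speech-recognition",
    model="facebook/mms-1b-l1107",
    model_kwargs=model_kwargs,
)
```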

Is there anything obvious I’m overlooking?