How can I convert a model created with fairseq?

Hi @ybelkada, thanks for your reply. I copied the `convert_bart_original_pytorch_checkpoint_to_pytorch.py` script into a `convert.py` file. I fine-tuned the mbart-cc25 model and stored the checkpoint at `/datadrive/checkpoint/checkpoint_best.pt`. To convert the model I am running:

```
python convert.py /datadrive/checkpoint/checkpoint_best.pt ~/ --hf_config facebook/mbart-large-cc25
```

and getting this error:

```
Traceback (most recent call last):
  File "convert.py", line 137, in <module>
    convert_bart_checkpoint(args.fairseq_path, args.pytorch_dump_folder_path, hf_checkpoint_name=args.hf_config)
  File "/home/anassmohammad19/nlp/lib/python3.7/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "convert.py", line 80, in convert_bart_checkpoint
    bart = load_xsum_checkpoint(checkpoint_path)
  File "convert.py", line 61, in load_xsum_checkpoint
    hub_interface.model.load_state_dict(sd["model"], strict=False)
  File "/home/anassmohammad19/nlp/lib/python3.7/site-packages/fairseq/models/fairseq_model.py", line 128, in load_state_dict
    return super().load_state_dict(new_state_dict, strict)
  File "/home/anassmohammad19/nlp/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1672, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for BARTModel:
    size mismatch for encoder.embed_tokens.weight: copying a param with shape torch.Size([250027, 1024]) from checkpoint, the shape in current model is torch.Size([50264, 1024]).
    size mismatch for decoder.embed_tokens.weight: copying a param with shape torch.Size([250027, 1024]) from checkpoint, the shape in current model is torch.Size([50264, 1024]).
    size mismatch for decoder.output_projection.weight: copying a param with shape torch.Size([250027, 1024]) from checkpoint, the shape in current model is torch.Size([50264, 1024]).
```
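For what it's worth, the shapes in the error suggest the script is loading my mBART checkpoint (250027-token vocabulary) into a plain `bart.large` model (50264-token vocabulary), and `strict=False` does not help because in PyTorch `strict=False` only tolerates missing/unexpected keys, not shape mismatches on shared keys. A minimal toy reproduction with `torch.nn.Embedding` (tiny made-up sizes standing in for the real vocabularies):

```python
import torch.nn as nn

# Stand-ins for encoder.embed_tokens: the checkpoint was trained with a larger
# vocabulary than the model the conversion script instantiates.
ckpt_embed = nn.Embedding(10, 4)   # plays the role of mBART (250027 x 1024)
model_embed = nn.Embedding(5, 4)   # plays the role of bart.large (50264 x 1024)

# strict=False only skips missing/unexpected keys; a *size* mismatch on a
# shared key still raises RuntimeError, matching the traceback above.
try:
    model_embed.load_state_dict(ckpt_embed.state_dict(), strict=False)
except RuntimeError as e:
    print("size mismatch" in str(e))  # prints: True
```

So it looks like the script's hard-coded `load_xsum_checkpoint` helper assumes a BART-sized vocabulary; presumably it would need to build the hub model from the mBART config instead.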