XLM for translation not working

Hi,
I’m following a tutorial on running inference with XLM, using the code below.

import torch
from transformers import XLMTokenizer, XLMWithLMHeadModel

tokenizer = XLMTokenizer.from_pretrained("xlm-clm-enfr-1024")
model = XLMWithLMHeadModel.from_pretrained("xlm-clm-enfr-1024")

# Encode the English input sentence into token IDs
input_ids = torch.tensor([tokenizer.encode("Hello, World")])

language_id = tokenizer.lang2id["en"]  # 0
langs = torch.tensor([language_id] * input_ids.shape[1])
# We reshape it to be of size (batch_size, sequence_length)
langs = langs.view(1, -1)

# Run the forward pass with the language embedding IDs
outputs = model(input_ids, langs=langs)

The output is a MaskedLMOutput. How do I convert it into words, i.e. get the translated text?

I tried decoding the logits like this:

# logits has shape (batch_size, sequence_length, vocab_size)
logits = model(input_ids, langs=langs).logits
# pick the highest-scoring token at every position
predicted_token_ids = logits[0].argmax(dim=-1)
print(tokenizer.decode(predicted_token_ids))

It prints out "The ello, The", which is clearly not a translation.
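
From reading around, I think "xlm-clm-enfr-1024" is a causal language model checkpoint, so the logits at each position score the next token rather than reconstructing the input, which might explain the shifted "The ello, The" output. Below is a greedy next-token loop I sketched to generate a continuation (max_new_tokens and the lack of a stopping condition are just my guesses). Is something like this the intended way to use the model, or is there a proper way to get an en→fr translation out of it?

import torch
from transformers import XLMTokenizer, XLMWithLMHeadModel

tokenizer = XLMTokenizer.from_pretrained("xlm-clm-enfr-1024")
model = XLMWithLMHeadModel.from_pretrained("xlm-clm-enfr-1024")

input_ids = torch.tensor([tokenizer.encode("Hello, World")])
language_id = tokenizer.lang2id["en"]

# Greedy decoding sketch: repeatedly append the most likely next token.
# max_new_tokens is an arbitrary guess; there is no stopping condition here.
max_new_tokens = 10
with torch.no_grad():
    for _ in range(max_new_tokens):
        # langs must match the current sequence length
        langs = torch.full_like(input_ids, language_id)
        logits = model(input_ids, langs=langs).logits
        # logits[0, -1] scores the token that should come after the last one
        next_token_id = logits[0, -1].argmax()
        input_ids = torch.cat([input_ids, next_token_id.view(1, 1)], dim=1)

print(tokenizer.decode(input_ids[0]))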