OSError: Can't load tokenizer for 'facebook/xmod-base'

If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'facebook/xmod-base' is the correct path to a directory containing all relevant files for a XLMRobertaTokenizerFast / BertTokenizerFast / GPT2TokenizerFast / BertJapaneseTokenizer / BloomTokenizerFast / CodeGenTokenizerFast tokenizer

If you are getting this error while initializing the tokenizer, load the XLM-R tokenizer instead:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")

As per the documentation, the X-MOD model reuses the tokenizer of XLM-R, so no tokenizer files are published under 'facebook/xmod-base'.
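For context, a minimal end-to-end sketch combining the two checkpoints (this assumes the transformers library with PyTorch installed and network access to download both models; the "en_XX" language code is just an example choice):

```python
from transformers import AutoModel, AutoTokenizer

# X-MOD ships no tokenizer files of its own; per the docs it reuses XLM-R's,
# so the tokenizer is loaded from the xlm-roberta-base checkpoint instead.
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModel.from_pretrained("facebook/xmod-base")

# X-MOD uses per-language adapters; pick one before running a forward pass.
model.set_default_language("en_XX")

inputs = tokenizer("Hello world", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```

The key point is only the tokenizer checkpoint changes; the model weights still come from 'facebook/xmod-base'.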