Pipeline fill-mask error with custom Roberta tokenizer

This may be helpful to someone else, so I'll keep the post up. The tokenizer needs to be re-saved as a Roberta tokenizer (not a raw BPE tokenizer) for the fill-mask pipeline to work. This solution is given here. Adding the suggested code lines fixed it:

from transformers import RobertaTokenizerFast

# Reload the tokenizer through the fast Roberta class so the special
# tokens (<s>, </s>, <pad>, <mask>) are registered for the pipeline
tokenizer = RobertaTokenizerFast.from_pretrained(
    tokenizer_folder, return_special_tokens_mask=True, max_length=512
)

# Overwrite the saved tokenizer files so future loads resolve to the Roberta class
tokenizer.save_pretrained(tokenizer_folder)
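
To confirm the fix works, here is a minimal usage sketch (the model path and example sentence are placeholders, not from the original post):

from transformers import pipeline

# Hypothetical paths; substitute your own checkpoint and tokenizer folders
fill_mask = pipeline("fill-mask", model=model_folder, tokenizer=tokenizer_folder)

# <mask> is Roberta's mask token; before re-saving, the pipeline could not
# find a registered mask token in the plain BPE tokenizer and raised an error
print(fill_mask("The capital of France is <mask>."))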