What to do when HuggingFace throws "Can't load tokenizer"

Whether I try the hosted Inference API or run the snippet from “Use in Transformers”, I get the following long error:

“Can’t load tokenizer using from_pretrained, please update its configuration: Can’t load tokenizer for ‘remi/bertabs-finetuned-extractive-abstractive-summarization’. If you were trying to load it from ‘https://huggingface.co/models’, make sure you don’t have a local directory with the same name. Otherwise, make sure ‘remi/bertabs-finetuned-extractive-abstractive-summarization’ is the correct path to a directory containing all relevant files for a BertTokenizerFast tokenizer.”
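
For reference, this is essentially the code I’m running (a minimal reproduction; the “Use in Transformers” snippet boils down to the same from_pretrained call):

```python
from transformers import AutoTokenizer

# This call fails with the error quoted above.
tokenizer = AutoTokenizer.from_pretrained(
    "remi/bertabs-finetuned-extractive-abstractive-summarization"
)
```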

This doesn’t just happen with this specific model; I get the same error with many models I have tried to run. Is there any solution for this, or do I have to fall back to something like the sketch below?
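
The only workaround I can think of is to load a tokenizer from the base checkpoint explicitly. Something along these lines seems to run, though bert-base-uncased is just my guess at the base model (the error mentions BertTokenizerFast, and the model card doesn’t say), so I’m not sure the vocabulary actually matches this checkpoint:

```python
from transformers import BertTokenizerFast

# Guessed fallback: borrow the tokenizer from the checkpoint the model was
# presumably fine-tuned from. "bert-base-uncased" is my assumption, not
# something confirmed by the model card.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
```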