“OSError: Model name './XX' was not found in tokenizers model name list” - cannot load custom tokenizer in Transformers

Ah, it is because I was using transformers == 3.3.1. After upgrading to v4 and importing AlbertTokenizerFast, I received the following error instead:

from transformers import AlbertTokenizerFast

# Re-create our tokenizer in transformers
tokenizer = AlbertTokenizerFast.from_pretrained("./Sent-AlBERT")

OSError: Can't load tokenizer for './Sent-AlBERT'. Make sure that:

- './Sent-AlBERT' is a correct model identifier listed on 'https://huggingface.co/models'

- or './Sent-AlBERT' is the correct path to a directory containing relevant tokenizer files
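In my case the directory existed but did not contain the files the fast tokenizer looks for. A minimal sketch to diagnose which files are present, assuming the usual transformers v4 file names (the exact list is my assumption, not part of the error message):

```python
import os

# Files a v4 fast ALBERT tokenizer typically expects in a local directory
# (assumed names; your saved tokenizer may not include all of them).
EXPECTED = [
    "tokenizer.json",           # serialized fast tokenizer
    "spiece.model",             # ALBERT's SentencePiece model
    "tokenizer_config.json",
    "special_tokens_map.json",
]

def missing_tokenizer_files(path):
    """Return the expected tokenizer files that are absent from `path`."""
    return [name for name in EXPECTED
            if not os.path.isfile(os.path.join(path, name))]

missing = missing_tokenizer_files("./Sent-AlBERT")
if missing:
    print("missing tokenizer files:", missing)
```

If files are missing, re-saving the tokenizer from the object you originally trained with `tokenizer.save_pretrained("./Sent-AlBERT")` usually regenerates them; note that in v4 the slow ALBERT tokenizer also needs the separate `sentencepiece` package installed.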