Wrong tokenizer paths in I-BERT-Large models

I encountered a wrong-tokenizer-path error while fine-tuning the pretrained I-BERT-Large model (kssteven/ibert-roberta-large).

  File "/home/yschoi/transformers_yschoi-dev/src/transformers/tokenization_utils_fast.py", line 110, in __init__
    fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
Exception: No such file or directory (os error 2)

This is probably caused by a wrong hardcoded `tokenizer_file` path in `tokenizer_config.json`.
`kssteven/ibert-roberta-large-mnli` might have the same problem, although I have not tested it.
`kssteven/ibert-roberta-base` works fine.
Please take a look at it.