Can't load tokenizer using from_pretrained, Inference API

Can't load tokenizer using from_pretrained, please update its configuration: data did not match any variant of untagged enum PyPreTokenizerTypeWrapper at line 560, column 3

Here are my project files: Files.
I could use the model locally from the local checkpoint folder after the finetune; however, when I upload the same checkpoint folder to Hugging Face as a model, it no longer seems to work. Could anyone please help me with this? Thank you!

Thanks for reporting, will forward to the team!

Duplicate of Model Inference API error

Thank you! Looking forward to hearing from the team!

I could use the same model uploaded on Kaggle. However, when loaded with AutoTokenizer it complained; it worked with MBart50Tokenizer.

Update: It works with MBart50Tokenizer from the Hugging Face checkpoint as well. Here is the Space where I was testing it.
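For anyone hitting the same error, here is a minimal sketch of the workaround described above: try AutoTokenizer first, and fall back to the model-specific MBart50Tokenizer when the uploaded tokenizer config cannot be parsed. The repo id below is a placeholder, not the actual checkpoint from this thread.

```python
from transformers import AutoTokenizer, MBart50Tokenizer


def load_tokenizer(repo_id: str):
    """Load a tokenizer from a hub checkpoint, falling back to the
    model-specific MBart50Tokenizer if AutoTokenizer fails."""
    try:
        return AutoTokenizer.from_pretrained(repo_id)
    except Exception:
        # AutoTokenizer can fail with "data did not match any variant
        # of untagged enum PyPreTokenizerTypeWrapper" when the fast
        # tokenizer.json in the checkpoint is incompatible; the slow,
        # model-specific class reads the vocab files directly instead.
        return MBart50Tokenizer.from_pretrained(repo_id)


# Example (placeholder repo id):
# tokenizer = load_tokenizer("your-username/your-mbart-checkpoint")
```

Note this only works around the loading error on your own machine or Space; the Inference API itself still uses AutoTokenizer, so the checkpoint's tokenizer files may need to be regenerated for it to work there.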