XLMTokenizer in Hosted inference API

Hi,

For the past few weeks, the Hosted Inference API hasn’t worked for some models that use the XLMTokenizer.
For example, the FlauBERT models are no longer working; the API says sacremoses needs to be installed (flaubert/flaubert_base_uncased · Hugging Face):

Can’t load tokenizer using from_pretrained, please update its configuration: You need to install sacremoses to use XLMTokenizer. See sacremoses · PyPI for installation.

I’ve read that we can’t install new modules in the Hosted Inference API, so how can I solve this issue? I have some models based on FlauBERT, and they don’t work anymore either.
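In the meantime, the error can be reproduced and understood locally. A minimal sketch (an assumption, simplified from how transformers lazily checks for optional dependencies; `require_sacremoses` is a hypothetical helper, not the library’s actual function) of why the tokenizer raises this message, and of the local workaround of installing sacremoses yourself:

```python
# Sketch (assumption): XLMTokenizer only imports sacremoses when needed,
# and raises an ImportError like the one above if it is missing.
import importlib.util


def require_sacremoses():
    """Raise an informative error if sacremoses is not installed."""
    if importlib.util.find_spec("sacremoses") is None:
        raise ImportError(
            "You need to install sacremoses to use XLMTokenizer. "
            "See https://pypi.org/project/sacremoses/ for installation."
        )


try:
    require_sacremoses()
    print("sacremoses is available")
except ImportError as err:
    # Locally, the fix is simply: pip install sacremoses
    print(err)
```

Locally, `pip install sacremoses` resolves it, but that doesn’t help on the hosted API, where we can’t control the environment.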

Thanks


The issue seems to be fixed now. I don’t know who did it, but thank you! :slight_smile: