Hi, I deployed a model to the Hugging Face Hub a while ago: inokufu/flaubert-base-uncased-xnli-sts · Hugging Face.
It no longer works in the Hosted Inference API widget (on the right-hand side of the model page). I am getting this error: “You need to install sacremoses to use FlaubertTokenizer. See sacremoses · PyPI for installation.”
Locally, I just have to install sacremoses (or change my transformers version), but I don’t understand why it isn’t working on the HF Hub directly. I also see that the Hosted Inference API works fine for the flaubert/flaubert_base_uncased model, without any recent updates to that repo.
What should I do to fix this issue?
Many thanks in advance for your help.