Inference API throwing error during install of spaCy whl


I’ve uploaded my spaCy model to the Hub. It’s supposed to be private for now, so I can’t link to it.
When I try to run it on the inference API, I get the following error:

Command '['/usr/local/bin/python', '-m', 'pip', 'install', '--cache-dir', '/data', '']' returned non-zero exit status 1.
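Note the empty string `''` at the end of the command: pip is being asked to install nothing, which suggests the API never resolved a wheel path for the repository. A minimal sketch reproducing that failure locally (the exact cause on the API side is an assumption):

```python
import subprocess
import sys

# Re-run the same pip invocation the Inference API logged, with the
# package argument left as an empty string. pip rejects the empty
# requirement and exits with a non-zero status, matching the error above.
result = subprocess.run(
    [sys.executable, "-m", "pip", "install", ""],
    capture_output=True,
    text=True,
)
print(result.returncode)  # non-zero
```

So the install itself fails before any spaCy-specific code runs, which is consistent with the wheel never being located.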

Even when I make my model public, it still doesn’t work. It seems like it’s failing to install the whl, but I have no clue why.

Does anyone know why this happens?

The Inference API does not work for non-transformers repositories at the moment (this is documented in 🤗 Accelerated Inference API). Do you have a link to the model repository, now that you’ve made it public?
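As a workaround until the Inference API supports spaCy repositories, the wheel can usually be installed and loaded locally. This sketch assumes a repo pushed with `spacy-huggingface-hub`, where the wheel sits at the repo root with an `-any-py3-none-any.whl` suffix; `your-username/en_your_model` is a hypothetical placeholder:

```python
# Hypothetical names for illustration — substitute your own repo and model.
repo_id = "your-username/en_your_model"
model_name = "en_your_model"

# Wheels pushed via `spacy huggingface-hub push` can typically be installed
# straight from the Hub's resolve endpoint (assumed filename convention):
wheel_url = (
    f"https://huggingface.co/{repo_id}/resolve/main/"
    f"{model_name}-any-py3-none-any.whl"
)
print(wheel_url)

# Then, outside this script:
#   pip install <wheel_url>
# and load the model with:
#   import spacy
#   nlp = spacy.load(model_name)
```

If the model is private, you’d also need to pass an access token (e.g. via `huggingface-cli login`) before pip can fetch the wheel.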