The model doesn't work on the HF Inference API, but it works locally

Hi, I can use my model (a trained sentence-transformers model) locally, but when I push it to the Hub, the model card shows this error:
:warning: This model could not be loaded by the inference API. :warning:

My model is private. I can load it from a local Python client by passing the `use_auth_token` parameter, and it works very well.
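For reference, this is roughly how I load it locally (the repo id and token below are placeholders, not my real ones):

```python
from sentence_transformers import SentenceTransformer

# Placeholder repo id and token for a private model on the Hub
model = SentenceTransformer(
    "my-username/my-private-model",
    use_auth_token="hf_xxx",  # token with read access to the private repo
)

# Encoding works fine locally
embeddings = model.encode(["This is a test sentence."])
print(embeddings.shape)
```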

Could this be an issue on HF's side? How can I get the inference widget working on my model card?