Inference turned off for this model?

I wanted to move some of the inference from my servers to HF, but for most models that I use I see "Inference API has been turned off for this model." This basically renders the HF Inference API useless. Am I missing something?
I am willing to pay for GPU usage, but it's not at all clear that this will work after I put my credit card details in place.

Hi there! Which model are you using?

Note that the error message applies only to the free Inference API. For dedicated deployments, you can use 🤗 Inference Endpoints, where you can load the model directly, specify your own custom code, and pay for GPUs if needed.
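Once an Inference Endpoint is deployed, you can query it over plain HTTP. A minimal sketch, assuming a `requests`-based client; the endpoint URL and token below are placeholders you would replace with the values shown on your endpoint's page:

```python
import requests

# Placeholder values -- substitute the URL and token from your own
# Inference Endpoints dashboard
API_URL = "https://your-endpoint.endpoints.huggingface.cloud"
HEADERS = {"Authorization": "Bearer hf_xxx"}

def query(payload: dict) -> dict:
    """POST a JSON payload to the deployed endpoint and return its JSON reply."""
    response = requests.post(API_URL, headers=HEADERS, json=payload)
    response.raise_for_status()
    return response.json()

# Example call (text-generation style payload):
# query({"inputs": "Hello, my name is"})
```

The same request shape works for the free Inference API (`https://api-inference.huggingface.co/models/<model-id>`) on models where it is enabled.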
