Hi! New to Hugging Face - dumb question. I created a PRO account because I wanted to run inference with a particular model (OpenBioLLM 70B). It gave me $2 of included inference and then “pay as you go.” I set up my code calling this model, everything worked on a small test dataset, so I decided to run it on the full dataset. But now I get an error response saying “You have exceeded your monthly included credits for Inference Providers. Pay-as-you-go above your included PRO quota will be available soon.” My understanding was that I could pay for usage above the $2 of “free” credits. How do I enable that? And what do they mean by “available soon”? Thanks so much for the help!
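For reference, my calling code is roughly the sketch below - the exact OpenBioLLM repo id and the chat-style call are my best guesses at reproducing the setup, not a verified recipe:

```python
from huggingface_hub import InferenceClient

# Assumed repo id for OpenBioLLM 70B - double-check the exact id on the Hub.
client = InferenceClient(
    model="aaditya/Llama3-OpenBioLLM-70B",
    token="hf_xxx",  # token from the PRO account
)

# Small test prompt; the real run loops over the full dataset.
response = client.chat_completion(
    messages=[{"role": "user", "content": "List common side effects of metformin."}],
    max_tokens=256,
)
print(response.choices[0].message.content)
```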
What do they mean by “available soon?”
It seems that we should just think of it as “under construction” or “preparations in progress”.
So if I wanted to pay for inference time on a model through API calls there is no way to do this currently?
No, but it’s coming very soon!
I’m really looking forward to the launch as soon as possible, because I’m facing the same issue:
{"error":"You have exceeded your monthly included credits for Inference Providers. Pay-as-you-go above your included PRO quota will be available soon."}
Any update on how “soon” it will be coming? This hasn’t been working for almost a week now, thanks!
I am getting a similar issue:
HfHubHTTPError: 402 Client Error: Payment Required for url: https://router.huggingface.co/hf-inference/models/meta-llama/Llama-3.3-70B-Instruct/v1/chat/completions (Request ID: Root=1-67df75db-40cf455e0033d40d6dbefb32;b2f30677-ab26-4ee1-81c5-db9b96b6b24d)
You have exceeded your monthly included credits for Inference Providers. Subscribe to PRO to get 20x more monthly included credits.
I have an Enterprise account, which also comes with $2 worth of inference - it showed this on my account - but now it’s saying I owe money.
I was at 48 cents’ worth of inference usage and it now says I owe 38 cents! It’s as if they just removed the credit!
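In the meantime I’ve wrapped my calls so the 402 stops the batch cleanly instead of crashing partway through. A minimal sketch, assuming huggingface_hub’s HfHubHTTPError (the import path can differ between versions):

```python
from huggingface_hub import InferenceClient
from huggingface_hub.utils import HfHubHTTPError

client = InferenceClient(model="meta-llama/Llama-3.3-70B-Instruct", token="hf_xxx")

def safe_chat(prompt):
    """Return the model reply, or None once the 402 quota error appears."""
    try:
        out = client.chat_completion(
            messages=[{"role": "user", "content": prompt}],
            max_tokens=128,
        )
        return out.choices[0].message.content
    except HfHubHTTPError as err:
        # HfHubHTTPError subclasses requests' HTTPError, so the HTTP response is attached.
        if err.response is not None and err.response.status_code == 402:
            print("Monthly included credits exhausted - stopping the batch here.")
            return None
        raise
```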
This update.
I’ve seen several reports that the PRO quota is not recognized on Enterprise accounts, but I think this is probably a bug in a conditional branch somewhere on the backend… @meganariley
I think the tokens themselves are the same, and it’s just that they have attributes like Pro or Enterprise.
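If it helps with debugging, whoami() shows what the Hub reports for a given token’s account. A rough sketch - the PRO/plan field names are assumptions on my part, so I just filter for anything that looks related rather than hard-coding keys:

```python
from huggingface_hub import HfApi

# whoami() reports what the Hub knows about the token's account.
info = HfApi(token="hf_xxx").whoami()
print(info.get("name"), info.get("type"))

# Field names for plan/PRO status are assumptions and may vary,
# so just dump anything that looks related.
print({k: v for k, v in info.items() if "pro" in k.lower() or "plan" in k.lower()})
```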