Spaces - Error: Could not complete request to HuggingFace API, Status Code: 400

When I submit an input to the Spaces model, I receive the message:

Error. Could not complete request to HuggingFace API, Status Code: 400, Error: Model requires a Pro subscription

I do have a Pro subscription, so I'm not sure why this isn't working.

I have tried rebooting the Space, but that hasn't made a difference. An image with the model type and the error is below.


I have the exact same issue, suspiciously with the exact same model! Did you find an answer for it?


Hi Pierz. No, I didn't find a solution. I just stopped trying to use that model on Spaces. It is a big model, so maybe that was the issue. I haven't had any problems with the 13B or 7B models. Still open to suggestions if anyone has thoughts.

Hi there! For the 7B, 13B, and 70B chat models, the Space uses a custom endpoint to make inference super fast without affecting other users. That's why it doesn't work when duplicated. You can change it to this endpoint if you have a PRO account: Julien Chaumond on LinkedIn, "Llama 2 just landed in Hugging Face Inference API 🔥".
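If it helps, here is a minimal sketch of what calling the Inference API with your own PRO token could look like, assuming the `huggingface_hub` client; the model ID, secret name `HF_TOKEN`, and prompt below are just illustrative:

```python
import os
from huggingface_hub import InferenceClient

# Illustrative model ID; use whichever Llama 2 chat model your Space targets.
MODEL_ID = "meta-llama/Llama-2-70b-chat-hf"

# Read the token from the Space's secrets instead of hard-coding it.
client = InferenceClient(model=MODEL_ID, token=os.environ["HF_TOKEN"])

# Simple text-generation call against the Inference API.
response = client.text_generation(
    "Explain what a Hugging Face Space is in one sentence.",
    max_new_tokens=100,
)
print(response)
```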


Thanks Osanseviero, this is useful. I'm new to using Hugging Face. I'm currently running the models on Spaces and have checked the Space's code, but I can't see where the code for the endpoint sits so I can update it. Do you have any suggestions on how to do this? I do have a PRO account.

Hi Stradegio. I see you have a Space (7B chat) that is working fine. Sorry for the confusion: the 7B and 13B Spaces actually run the model directly on the Space, so duplicating the base ones, assigning a GPU, and adding your token should be enough. A rough sketch of that setup is below.
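As an illustration only (assuming a recent `transformers` and that you've added your token as a Space secret named `HF_TOKEN`), the duplicated Space's app code would load the model roughly like this:

```python
import os
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative model ID for the 7B chat model; adjust to the one your Space duplicates.
MODEL_ID = "meta-llama/Llama-2-7b-chat-hf"

# A token added as a Space secret is exposed to the app as an environment variable.
token = os.environ["HF_TOKEN"]

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, token=token)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # half precision to fit on the GPU assigned to the Space
    device_map="auto",
    token=token,
)

# Quick sanity check that generation works on the assigned GPU.
inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```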