I have already done the following:
Created an API token on Hugging Face
Checked that my token has “Make calls to inference providers” permission
Tried regenerating a new token and replacing it in my script
Confirmed that my Hugging Face account is active and not restricted
This is the error message I get:
import requests

HF_TOKEN = "YOUR_NEW_API_TOKEN"  # placeholder for the regenerated token
API_URL = "https://api-inference.huggingface.co/models/google/flan-t5-large"
HEADERS = {"Authorization": f"Bearer {HF_TOKEN}"}

response = requests.get(API_URL, headers=HEADERS)
print("Response status code:", response.status_code)
print("Response JSON:", response.json())
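For reference, a GET against the model URL mainly tells you whether the route and the token are accepted; actual inference on the serverless API is normally a POST with a JSON "inputs" payload. A minimal sketch of that call, assuming the same endpoint and token placeholder as above:

import requests

HF_TOKEN = "YOUR_NEW_API_TOKEN"  # placeholder
API_URL = "https://api-inference.huggingface.co/models/google/flan-t5-large"
HEADERS = {"Authorization": f"Bearer {HF_TOKEN}"}

# Inference requests are POSTed with an "inputs" payload rather than sent as a bare GET.
payload = {"inputs": "Translate to German: Hello, how are you?"}
response = requests.post(API_URL, headers=HEADERS, json=payload)

print("Response status code:", response.status_code)
print("Response JSON:", response.json())

If this POST also fails while the GET returns 200, the problem is more likely on the model or deployment side than with the token itself.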
It works for me.
Response status code: 200
Maybe try:
pip install -U huggingface_hub
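If you do upgrade huggingface_hub, it can also be worth trying its InferenceClient instead of hand-rolled requests, since it resolves the current endpoint and headers for you. A minimal sketch, assuming a chat model that is actually served for your account (the model name here is only an example):

from huggingface_hub import InferenceClient

HF_TOKEN = "YOUR_NEW_API_TOKEN"  # placeholder

# The client builds the URL and the Authorization header itself.
client = InferenceClient(token=HF_TOKEN)

# Model choice is illustrative; it must be deployed for inference on your account.
completion = client.chat_completion(
    model="Qwen/Qwen2.5-7B-Instruct",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    max_tokens=50,
)
print(completion.choices[0].message.content)

If the same token fails here too, that points away from the raw-requests code and toward the token or the service.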
Is it possible that it only works with a pro membership?
I don’t know… even models that shouldn’t be pro-only show the same behavior. Maybe the whole thing is buggy.
HF_TOKEN = "my token"
import requests
#API_URL = "https://api-inference.huggingface.co/models/google/flan-t5-large"
API_URL = "https://api-inference.huggingface.co/models/Qwen/Qwen2.5-3B"
HEADERS = {"Authorization": f"Bearer {HF_TOKEN}"}
response = requests.get(API_URL, headers=HEADERS)
print("Response status code:", response.status_code)
print("Response JSON:", response.json())
You mean the language model might be buggy?
No, not the model itself, but the Hugging Face Inference API system. It looks like it is being overhauled.
I did all of that, but it still doesn’t work. If I message my Telegram bot, it only reacts to /start; the AI language model part doesn’t respond.
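If only /start gets a reaction, one thing to rule out is that the bot simply has no handler registered for plain text messages, independently of the Hugging Face side. A minimal sketch, assuming python-telegram-bot v20+ and the same requests-style call as above (both tokens and the model are placeholders):

import requests
from telegram import Update
from telegram.ext import ApplicationBuilder, CommandHandler, ContextTypes, MessageHandler, filters

HF_TOKEN = "YOUR_HF_TOKEN"              # placeholder
TELEGRAM_TOKEN = "YOUR_TELEGRAM_TOKEN"  # placeholder
API_URL = "https://api-inference.huggingface.co/models/google/flan-t5-large"
HEADERS = {"Authorization": f"Bearer {HF_TOKEN}"}

async def start(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
    await update.message.reply_text("Hi! Send me a message and I'll ask the model.")

async def ask_model(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
    # Forward the user's text to the Inference API and relay the reply or the error.
    resp = requests.post(API_URL, headers=HEADERS, json={"inputs": update.message.text})
    if resp.status_code == 200:
        await update.message.reply_text(str(resp.json()))
    else:
        await update.message.reply_text(f"API error {resp.status_code}: {resp.text}")

app = ApplicationBuilder().token(TELEGRAM_TOKEN).build()
app.add_handler(CommandHandler("start", start))
# Without a MessageHandler for plain text, only /start will ever get a response.
app.add_handler(MessageHandler(filters.TEXT & ~filters.COMMAND, ask_model))
app.run_polling()

If the bot answers with an API error message instead of staying silent, at least the handler is wired up and the remaining issue is on the Hugging Face call.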
Hmm, I don’t know who is in charge.
website@huggingface.co