Hello!
The problem is: I've generated several tokens, but none of them works. :(
The errors are:
`API: Authorization header is correct, but the token seems invalid`
`Invalid token or no access to Hugging Face`
I tried a write token, a read token, and a token with all permissions.
What am I doing wrong?
That's a rare case; it's unusual for only the token to be the problem.
Perhaps you're trying to access a gated model but haven't been granted access to that particular model?
You can rule out the token itself with the quick check below, or it may be something like the issue quoted after it:
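A minimal sketch to check the token and the gated-repo access separately, assuming `huggingface_hub` is installed; the token string is a placeholder for your own token:

```python
from huggingface_hub import whoami, model_info
from huggingface_hub.utils import HfHubHTTPError

token = "hf_xxx"  # placeholder: paste the token you generated

# 1) Does the Hub accept the token at all?
try:
    info = whoami(token=token)
    print("Token is valid for user:", info["name"])
except HfHubHTTPError as err:
    print("The Hub rejects this token:", err)

# 2) Does this token's account have access to the gated model?
try:
    model_info("mistralai/Mistral-Nemo-Instruct-2407", token=token)
    print("Access to the gated repo is granted.")
except HfHubHTTPError as err:
    print("No access to this repo with this token:", err)
```

If the first check passes but the second fails with a 403 / gated-repo error, the token itself is fine and you just need to request access to that particular model on its page.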
GitHub issue, opened 04 Sep 2024 07:57 AM UTC, labeled `bug`:
### Describe the bug
My issue is this error. I was working with the same 'HF_TOKEN' (with write permission) and the Mistral Nemo 12B Instruct model. The model had been working fine for the last few days without any issue, and today this error suddenly appeared.
The error is persistent: I have refreshed the token, tried other models, and also checked the Hugging Face Inference API server status, but the issue remains the same.
### Issue
**BadRequestError**
**huggingface_hub.utils._errors.BadRequestError: (Request ID: Mq1mDWKbogI0AleJS0HJM)**
**Bad request: Authorization header is correct, but the token seems invalid**
This is my `app.py` file; I have not modified or added any other files:
```python
import os

from huggingface_hub import InferenceClient
from dotenv import load_dotenv

# Load environment variables from .env file
load_dotenv()

# Authenticate with Hugging Face
HFT = os.getenv('HF_TOKEN')
client = InferenceClient(model="mistralai/Mistral-Nemo-Instruct-2407", token=HFT)

# Chat messages (defined elsewhere in the original app; placeholders here)
system_role = {"role": "system", "content": "..."}
user_prompt = {"role": "user", "content": "..."}

# Stream the completion and accumulate the generated text
response = ""
for message in client.chat_completion(
    messages=[system_role, user_prompt],
    max_tokens=3000,
    stream=True,
    temperature=0.35,
):
    response += message.choices[0].delta.content
```
### Reproduction
_No response_
### Logs
```shell
Traceback (most recent call last):
File "C:\Users\Admin\Documents\app_30082024\envo\lib\site-packages\huggingface_hub\utils\_errors.py", line 304, in hf_raise_for_status
response.raise_for_status()
File "C:\Users\Admin\Documents\app_30082024\envo\lib\site-packages\requests\models.py", line 1024, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api-inference.huggingface.co/models/mistralai/Mistral-Nemo-Instruct-2407/v1/chat/completions
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\Users\Admin\Documents\app_30082024\envo\lib\site-packages\flask\app.py", line 1498, in __call__
return self.wsgi_app(environ, start_response)
File "C:\Users\Admin\Documents\app_30082024\envo\lib\site-packages\flask\app.py", line 1476, in wsgi_app
response = self.handle_exception(e)
File "C:\Users\Admin\Documents\app_30082024\envo\lib\site-packages\flask\app.py", line 1473, in wsgi_app
response = self.full_dispatch_request()
File "C:\Users\Admin\Documents\app_30082024\envo\lib\site-packages\flask\app.py", line 882, in full_dispatch_request
rv = self.handle_user_exception(e)
File "C:\Users\Admin\Documents\_app_30082024\envo\lib\site-packages\flask\app.py", line 880, in full_dispatch_request
rv = self.dispatch_request()
File "C:\Users\Admin\Documents\app_30082024\envo\lib\site-packages\flask\app.py", line 865, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args) # type: ignore[no-any-return]
File "C:\Users\Admin\Documents\app_30082024\app5.py", line 78, in process_file
per_data = Model_PersonalDetails_Output(resume, client)
File "C:\Users\Admin\Documents\app_30082024\utility\llm_generate.py", line 127, in Model_PersonalDetails_Output
for message in client.chat_completion(
File "C:\Users\Admin\Documents\app_30082024\envo\lib\site-packages\huggingface_hub\inference\_client.py", line 837, in chat_completion
data = self.post(
File "C:\Users\Admin\Documents\app_30082024\envo\lib\site-packages\huggingface_hub\inference\_client.py", line 304, in post
hf_raise_for_status(response)
File "C:\Users\Admin\Documents\app_30082024\envo\lib\site-packages\huggingface_hub\utils\_errors.py", line 358, in hf_raise_for_status
raise BadRequestError(message, response=response) from e
huggingface_hub.utils._errors.BadRequestError: (Request ID: Mq1mDWKbogI0AleJS0HJM)
Bad request:
Authorization header is correct, but the token seems invalid
```
### System info
```shell
- huggingface_hub version: 0.24.6
- Platform: Windows-10-10.0.19045-SP0
- Python version: 3.10.10
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Token path ?: C:\Users\Admin\.cache\huggingface\token
- Has saved token ?: True
- FastAI: N/A
- Tensorflow: N/A
- Torch: 2.4.0
- Jinja2: 3.1.4
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: 10.4.0
- hf_transfer: N/A
- gradio: N/A
- tensorboard: N/A
- numpy: 1.26.4
- pydantic: N/A
- aiohttp: N/A
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: C:\Users\Admin\.cache\huggingface\hub
- HF_ASSETS_CACHE: C:\Users\Admin\.cache\huggingface\assets
- HF_TOKEN_PATH: C:\Users\Admin\.cache\huggingface\token
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
```