Hi Hugging Face team and community,
I’m currently working on MLPerf inference benchmarking and need to download the Mixtral-8x7B checkpoint using the `mlcr` CLI tool.
I’ve encountered repeated issues with token authentication:
- Tokens generated via the Hugging Face website are only 37 characters long; I'm not sure whether that is expected with the newer versions, since I have heard tokens should be 40 characters.
- Attempts to use these tokens with `curl` and `mlcr` result in "Invalid credentials in Authorization header" (a sketch of the kind of call I'm making follows this list).
- I’ve tried both fine-grained and non-fine-grained tokens with read access.
- I can log in via the browser and have accepted the model’s license terms.
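Roughly, the pattern I'm using to pass the token looks like the sketch below; the repo ID and file name are illustrative, not my exact command:

export HF_TOKEN='hf_...'   # token copied from the HF settings page
curl -L -H "Authorization: Bearer $HF_TOKEN" \
  https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1/resolve/main/config.json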
Is there a known issue with token generation or CLI authentication? Any help or workaround would be appreciated.
Thanks in advance!
— Nicholas Wakou
HF tokens are often 37 characters long, including the `hf_` prefix and the characters that follow. However, now that you mention it, I'm not sure the character count is actually fixed…
There are rare cases where the token gets corrupted during copy-paste (shortcut-key operations in the GUI, etc.), so I recommend testing the token's validity against the `whoami-v2` endpoint:
export HF_TOKEN='hf_...'   # paste the full token, including the hf_ prefix
curl -H "Authorization: Bearer $HF_TOKEN" https://huggingface.co/api/whoami-v2
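If that curl check succeeds but the `mlcr` download still fails, a second sanity check with the official CLI can rule out quoting or copy-paste issues. This is only a sketch and assumes `huggingface_hub` is installed (e.g. `pip install -U "huggingface_hub[cli]"`):

huggingface-cli login --token "$HF_TOKEN"   # stores the token locally for CLI/library use
huggingface-cli whoami                      # should print your username if the token is valid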