My account is being rate limited no matter which operation I perform, making it impossible to do anything on the HF platform. I have waited a few hours with no usage at all, but the issue persists. I suspect the root cause is that I am rate limited on the whoami-v2 API, which every other operation depends on.
While it is understandable that rate limits apply to heavy operations such as uploading large datasets or models to the Hub, I am currently hitting rate limits on simple operations that are never performed in bulk, for example:
huggingface-cli login
(Inference) ig00dani@cfdebd1326d1:/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/hf_cache$ huggingface-cli login
(huggingface_hub ASCII-art banner)
To log in, `huggingface_hub` requires a token generated from https://huggingface.co/settings/tokens .
Enter your token (input will not be visible):
Add token as git credential? (Y/n) n
Traceback (most recent call last):
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/utils/_http.py", line 406, in hf_raise_for_status
response.raise_for_status()
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/requests/models.py", line 1024, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 429 Client Error: Too Many Requests for url: https://huggingface.co/api/whoami-v2
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/hf_api.py", line 1633, in whoami
hf_raise_for_status(r)
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/utils/_http.py", line 477, in hf_raise_for_status
raise _format(HfHubHTTPError, str(e), response) from e
huggingface_hub.errors.HfHubHTTPError: 429 Client Error: Too Many Requests for url: https://huggingface.co/api/whoami-v2
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/bin/huggingface-cli", line 8, in <module>
sys.exit(main())
^^^^^^
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/commands/huggingface_cli.py", line 57, in main
service.run()
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/commands/user.py", line 153, in run
login(
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/utils/_deprecation.py", line 101, in inner_f
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/utils/_deprecation.py", line 31, in inner_f
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/_login.py", line 130, in login
interpreter_login(new_session=new_session)
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/utils/_deprecation.py", line 101, in inner_f
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/utils/_deprecation.py", line 31, in inner_f
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/_login.py", line 290, in interpreter_login
_login(token=token, add_to_git_credential=add_to_git_credential)
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/_login.py", line 404, in _login
token_info = whoami(token)
^^^^^^^^^^^^^
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir/Mambaforge/envs/Inference/lib/python3.11/site-packages/huggingface_hub/hf_api.py", line 1635, in whoami
raise HTTPError(
requests.exceptions.HTTPError: Invalid user token. If you didn't pass a user token, make sure you are properly logged in by executing `huggingface-cli login`, and if you did pass a user token, double-check it's correct.
(Inference) ig00dani@cfdebd1326d1:/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/hf_cache$
huggingface-cli whoami
(Inference) ig00dani@cfdebd1326d1:/pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/IP_WorkDir$ huggingface-cli whoami
Invalid user token. If you didn't pass a user token, make sure you are properly logged in by executing `huggingface-cli login`, and if you did pass a user token, double-check it's correct.
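Note that the CLI reports "Invalid user token" even though the underlying failure is the 429. A sketch of how the two cases can be told apart by inspecting the exception chain (this relies on the `from e` re-raise visible in the traceback above; the token below is a hypothetical placeholder):

```python
import requests
from huggingface_hub import HfApi
from huggingface_hub.errors import HfHubHTTPError

api = HfApi()
try:
    info = api.whoami(token="hf_xxx")  # hypothetical placeholder token
    print("logged in as", info["name"])
except requests.exceptions.HTTPError as err:
    # In huggingface_hub 0.27.1 a 429 from /api/whoami-v2 is re-raised as a
    # generic "Invalid user token" HTTPError; the real cause is chained on __cause__.
    cause = err.__cause__
    if (
        isinstance(cause, HfHubHTTPError)
        and cause.response is not None
        and cause.response.status_code == 429
    ):
        print("actually rate limited, not an invalid token")
        print("Retry-After:", cause.response.headers.get("Retry-After"))  # may be absent
    else:
        raise
```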
The accompanying HTML is the standard Hugging Face 429 page; its only visible content is:

429
We had to rate limit you. If you think it's an error, send us an email (website@huggingface.co)
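For a minimal check that is independent of huggingface_hub, probing the endpoint directly with requests should show the same 429; a sketch (whether the Hub sends a Retry-After header is an assumption on my part):

```python
import requests

TOKEN = "hf_xxx"  # hypothetical placeholder for a valid user access token

resp = requests.get(
    "https://huggingface.co/api/whoami-v2",
    headers={"authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
print(resp.status_code)                 # 429 while the limit is active
print(resp.headers.get("Retry-After"))  # may be None if the Hub does not send it
```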
Please suggest how I can fix this as soon as possible.
Reproduction
No response
System info
- huggingface_hub version: 0.27.1
- Platform: Linux-4.18.0-513.5.1.el8_9.x86_64-x86_64-with-glibc2.31
- Python version: 3.11.11
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Running in Google Colab Enterprise ?: No
- Token path ?: /pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/hf_cache/token
- Has saved token ?: False
- Configured git credential helpers:
- FastAI: N/A
- Tensorflow: N/A
- Torch: 2.5.1
- Jinja2: 3.1.5
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: 10.4.0
- hf_transfer: 0.1.9
- gradio: N/A
- tensorboard: N/A
- numpy: 1.26.4
- pydantic: 2.10.6
- aiohttp: 3.11.11
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: /pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/hf_cache/hub
- HF_ASSETS_CACHE: /pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/hf_cache/assets
- HF_TOKEN_PATH: /pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/hf_cache/token
- HF_STORED_TOKENS_PATH: /pfss/mlde/workspaces/mlde_wsp_PI_Gurevych/hf_cache/stored_tokens
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: True
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10