Hello everyone, about 5-6 hours ago I started running into strange errors. Models and datasets refuse to download, even ones that I have published myself. Disabling Xet did not help, and neither did reinstalling the libraries, running sudo apt update/upgrade, or reinstalling certificates. I found that the problem is https://cas-bridge.xethub.hf.co - it does not respond to a curl request. Has anyone encountered this before?
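For reference, roughly the connectivity check I ran, so others can compare (a sketch; the Hub API URL is just an example endpoint):
# Rough sketch of the connectivity check (illustrative, not an exact reproduction):
# hit the plain Hub API and the Xet CAS bridge separately to see which one fails.
import requests

for url in (
    "https://huggingface.co/api/models/bert-base-uncased",  # regular Hub API endpoint
    "https://cas-bridge.xethub.hf.co",                       # Xet CAS bridge host
):
    try:
        resp = requests.get(url, timeout=10)
        print(f"{url} -> HTTP {resp.status_code}")
    except requests.RequestException as exc:
        print(f"{url} -> FAILED: {exc}")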
Domain (false) blacklisting issue?
I don't think so, I don't work in a secure corporate VPN network :)
Hi @Kostya165 - Xet team member here; so sorry to hear you're running into problems!
Could you provide more information about what you're experiencing? Specifically:
- Some example repos where you're experiencing this.
- What APIs you're using to download with (e.g., huggingface_hub's snapshot_download or transformers' from_pretrained, etc.); or, if you're using the web browser/curl, some information about that (e.g., which web browser).
- When you were using hf-xet, what version of the package were you using? A full output of pip freeze would be helpful (one way to gather most of this is sketched below the list).
- What platform/OS you're using.
- Any error statements or logs.
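For example, a quick sketch (not an official diagnostic tool) that collects most of the above in one go:
import platform
import subprocess
import sys

import huggingface_hub

# Basic environment details requested above.
print("python          :", sys.version)
print("platform        :", platform.platform())
print("huggingface_hub :", huggingface_hub.__version__)

# Full package listing, including the installed hf-xet version.
print(subprocess.run([sys.executable, "-m", "pip", "freeze"],
                     capture_output=True, text=True).stdout)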
Hello,
I use a JetBrains IDE and my Python version is 3.11.
It doesn't work with the NVILA model.
I would like to share my code here; I'm currently on an enterprise VPN:
import os

# Point requests/curl at the corporate CA bundle so TLS verification works behind the VPN proxy.
os.environ['REQUESTS_CA_BUNDLE'] = r'mycert.pem'
os.environ['CURL_CA_BUNDLE'] = os.environ['REQUESTS_CA_BUNDLE']
# os.environ['HF_HUB_DISABLE_SSL_VERIFY'] = '1'
from huggingface_hub import snapshot_download
from transformers import AutoTokenizer, AutoModelForCausalLM
# --- CONFIGURATION ---
REPO_ID = "Efficient-Large-Model/NVILA-8B-Video"
LOCAL_DIR = os.path.abspath(r'model_nvila_8B')
if not os.path.exists(LOCAL_DIR):
snapshot_download(
repo_id=REPO_ID,
repo_type="model",
local_dir=LOCAL_DIR,
)
else:
print("Model already downloaded")
Please find the log below:
{"timestamp":"2025-06-17T13:18:14.390107Z","level":"WARN","fields":{"message":"Reqwest(reqwest::Error { kind: Request, url: \"https://cas-server.xethub.hf.co/reconstruction/688bce9d9130b743bf92cda612964528312d5d3c97a990e7c3e4a9db97428c6f\", source: hyper_util::client::legacy::Error(Connect, Custom { kind: InvalidData, error: InvalidCertificate(UnknownIssuer) }) }). Retrying..."},"filename":"D:\\a\\xet-core\\xet-core\\cas_client\\src\\http_client.rs","line_number":242}
Python stack trace:
RuntimeError: Data processing error: CAS service error : ReqwestMiddleware Error: Request failed after 5 retries
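A possible interim workaround while this is investigated (a sketch, assuming the non-Xet HTTP path works with the corporate CA bundle) is to disable hf-xet via the HF_HUB_DISABLE_XET environment variable before importing huggingface_hub:
import os

# Must be set before huggingface_hub is imported; forces the regular HTTP
# download path, which honors REQUESTS_CA_BUNDLE.
os.environ['HF_HUB_DISABLE_XET'] = '1'
os.environ['REQUESTS_CA_BUNDLE'] = r'mycert.pem'  # corporate CA, as in the snippet above

from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="Efficient-Large-Model/NVILA-8B-Video",
    local_dir="model_nvila_8B",
)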
Hi @hadjuse - I believe this is likely a similar issue to Xet hub InvalidCertificate(UnknownIssuer) · Issue #389 · huggingface/xet-core · GitHub.
We'll investigate there and follow up here, but feel free to add your report to that issue as well so you can follow along and get updates.
Hello! Thanks for the report. We introduced some changes to the cert loading to fix some other reported issues, but it seems they weren't fully robust. I added a few more changes to try to ensure these cases are covered too. Could you all see if pip install hf_xet==v1.1.5-rc1 fixes your issues @Kostya165 @John6666 @hadjuse?
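To double-check that the release candidate is the build actually in use, something like this should show the expected versions (a sketch; 1.1.5rc1 is how pip normalizes the tag above):
from importlib.metadata import version

# Confirm which builds are installed before re-running the failing download.
print("hf_xet          :", version("hf_xet"))            # expect something like 1.1.5rc1
print("huggingface_hub :", version("huggingface_hub"))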
Seems to be working well for me. (Windows (not WSL2), Python 3.9)
pip install -U huggingface_hub[hf_xet]
import os
from huggingface_hub import snapshot_download
from transformers import AutoTokenizer, AutoModelForCausalLM
# --- CONFIGURATION ---
REPO_ID = "Efficient-Large-Model/NVILA-8B-Video"
LOCAL_DIR = os.path.abspath(r'model_nvila_8B')
if not os.path.exists(LOCAL_DIR):
snapshot_download(
repo_id=REPO_ID,
repo_type="model",
local_dir=LOCAL_DIR,
)
else:
print("Model already downloaded")
README.md: 100%|████████████████████████████████████████| 4.29k/4.29k [00:00<00:00, 2.14MB/s]
added_tokens.json: 100%|████████████████████████████████████████| 194/194 [00:00<00:00, 64.6kB/s]
generation_config.json: 100%|████████████████████████████████████████| 243/243 [00:00<00:00, 122kB/s]
config.json: 100%|████████████████████████████████████████| 226k/226k [00:00<00:00, 8.70MB/s]
merges.txt: 100%|████████████████████████████████████████| 1.67M/1.67M [00:00<00:00, 21.1MB/s]
config.json: 100%|████████████████████████████████████████| 186k/186k [00:00<00:00, 1.19MB/s]
.gitattributes: 100%|████████████████████████████████████████| 1.52k/1.52k [00:00<00:00, 759kB/s]
model.safetensors.index.json: 100%|████████████████████████████████████████| 27.8k/27.8k [00:00<00:00, 9.25MB/s]
special_tokens_map.json: 100%|████████████████████████████████████████| 555/555 [00:00<00:00, 277kB/s]
tokenizer_config.json: 100%|████████████████████████████████████████| 2.30k/2.30k [00:00<00:00, 1.15MB/s]
config.json: 100%|████████████████████████████████████████| 313/313 [00:00<00:00, 156kB/s]
trainer_state.json: 100%|████████████████████████████████████████| 128k/128k [00:00<00:00, 860kB/s]
config.json: 100%|████████████████████████████████████████| 650/650 [00:00<00:00, 217kB/s]
vocab.json: 100%|████████████████████████████████████████| 3.38M/3.38M [00:00<00:00, 4.32MB/s]
preprocessor_config.json: 100%|████████████████████████████████████████| 394/394 [00:00<00:00, 131kB/s]
mm_projector/model.safetensors: 100%|████████████████████████████████████████| 58.8M/58.8M [01:14<00:00, 789kB/s]
llm/model-00001-of-00004.safetensors:   0%|          | 3.39M/4.87G [01:11<29:12:40, 46.3kB/s]
llm/model-00003-of-00004.safetensors:   0%|          | 2.27M/4.33G [00:38<9:42:44, 124kB/s]
llm/model-00004-of-00004.safetensors:   7%|███       | 81.2M/1.09G [01:10<13:31, 1.24MB/s]
llm/model-00004-of-00004.safetensors:  14%|██████    | 148M/1.09G [01:14<06:11, 2.53MB/s]
vision_tower/model.safetensors:   4%|██        | 30.4M/827M [01:08<1:09:24, 191kB/s]
Hello everyone - we just released a new version of hf_xet to address this issue. Please upgrade to the newest release (pip install --upgrade hf_xet==v1.1.5) if you encounter any of these issues.
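After upgrading, a one-file smoke test is enough to confirm the Xet path works again (a sketch using a small file from the repo discussed above):
from huggingface_hub import hf_hub_download

# Download a single small file; any remaining CAS/certificate problem would
# surface here as a RuntimeError like the one reported earlier in the thread.
path = hf_hub_download(
    repo_id="Efficient-Large-Model/NVILA-8B-Video",
    filename="config.json",
)
print("downloaded to:", path)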