I couldn’t find any examples of a 422 error on Hugging Face, since it’s rare apart from Inference API-related errors… sorry about that.
Although a Fatal error isn’t a 422, it usually means the network connection itself isn’t working properly. In the case below, an IPv6 setting turned out to be the cause, but there are various other possibilities.
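If IPv6 is the suspect, one common workaround is to force Python’s resolver to return IPv4 addresses only. This is a minimal diagnostic sketch, not an official huggingface_hub feature; it monkeypatches `socket.getaddrinfo`, which libraries like `requests` (used by `huggingface_hub`) go through when opening connections.

```python
import socket

# Sketch of an IPv6 workaround (assumption: the failure is a broken IPv6
# route, as in the issue below). Restricting name resolution to AF_INET
# makes downstream HTTP libraries connect over IPv4 only.
_orig_getaddrinfo = socket.getaddrinfo

def ipv4_only_getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
    # Ignore the requested family and always ask for IPv4 results.
    return _orig_getaddrinfo(host, port, socket.AF_INET, type, proto, flags)

socket.getaddrinfo = ipv4_only_getaddrinfo
```

After this patch, any `from_pretrained`-style download in the same process should use IPv4; if the download then succeeds, IPv6 was likely the problem.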
GitHub issue, opened 06:00PM - 23 Feb 24 UTC, labeled `bug`:
### Describe the bug
I am getting this error "We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like sentence-transformers/all-MiniLM-L6-v2 is not the path to a directory containing a file named config.json." Although I am able to access the website using my web browser, and I have also tried creating a new token and using that, the result is the same. Unable to use any model.
### Reproduction
_No response_
### Logs
```shell
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like sentence-transformers/all-MiniLM-L6-v2 is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
```
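The error message itself points to offline mode as an alternative: if the model files are already in the local cache, the network can be skipped entirely. A short sketch, using the documented `HF_HUB_OFFLINE` and `TRANSFORMERS_OFFLINE` environment variables (these must be set before the libraries make any network call):

```python
import os

# Offline-mode sketch: with the model already cached locally, these
# settings tell huggingface_hub and transformers to never hit the network.
os.environ["HF_HUB_OFFLINE"] = "1"        # huggingface_hub: cache only
os.environ["TRANSFORMERS_OFFLINE"] = "1"  # transformers: cache only

# Many from_pretrained-style loaders also accept local_files_only, e.g.
# (illustrative only; requires the files to already be in the cache):
# model = AutoModel.from_pretrained(
#     "sentence-transformers/all-MiniLM-L6-v2", local_files_only=True
# )
```

Note that in the report above `HF_HUB_OFFLINE` is `False`, so the library was correctly trying the network; offline mode only helps once a previous successful download has populated the cache.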
### System info
```shell
- huggingface_hub version: 0.20.2
- Platform: Windows-11-10.0.22621-SP0
- Python version: 3.12.2
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Token path ?: C:\Users\panwa\.cache\huggingface\token
- Has saved token ?: False
- Configured git credential helpers: manager
- FastAI: N/A
- Tensorflow: N/A
- Torch: 2.2.1
- Jinja2: 3.1.3
- Graphviz: N/A
- Pydot: N/A
- Pillow: 10.2.0
- hf_transfer: N/A
- gradio: N/A
- tensorboard: N/A
- numpy: 1.26.4
- pydantic: 2.6.1
- aiohttp: 3.9.3
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: C:\Users\panwa\.cache\huggingface\hub
- HF_ASSETS_CACHE: C:\Users\panwa\.cache\huggingface\assets
- HF_TOKEN_PATH: C:\Users\panwa\.cache\huggingface\token
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
```