Hi,
I cloned a working Space, but when I try to modify the code and commit to main, I get this error. My overall space usage is fine, just 59 GB used. Even a factory reset or a restart of the Space doesn't help; it gets stuck in an endless build, with no output in the log file. Any ideas?
Thanks for reading
I am having exactly the same issue.
I also tried via the Hub API, but got the same error.
Any fix for this?
The build stack is probably the same as this one. Also, the free CPU Space tier was, as far as I know, 2 vCPUs, 16 GB RAM, 0 GB VRAM, and 50 GB SSD…
I have the same issue too!
Today I tried to edit my Space code, but I cannot commit new updates due to the storage limit.
Any solution for this? I am using ZeroGPU.
> I am using ZeroGPU.
It’s definitely a bug. The storage capacity of a ZeroGPU Space is large enough that it’s practically impossible to use it all, so there is a very high chance the error message is wrong.
Experiencing this too; yesterday I was able to commit changes to main.
I just reported it on the HF Discord.
I wonder if the handling of the storage limit that suddenly appeared today is buggy.
Could someone from Hugging Face support please fix this issue as soon as possible? It is blocking some important updates we need to make to the Space.
Same thing here. I assume it's a bug in the new storage limit change from yesterday, even though the repo is under 1 GB.
This has been fixed for me now. Thanks, Hugging Face!
I’m having the same issue. How did you get yours fixed?
Also still having the same issue.
Respair (December 15, 2024, 9:58am)
I even bought the Pro subscription.
Still facing the same issue. What can you even do with 1 GB?
We can put data in a public model or dataset repo for free, even several terabytes of it, so I think they want us to load it from there when the Space starts up.
Downloading to Spaces is very fast.
However, this is troublesome when bringing in a GitHub sample as-is.
Anyway, it seems like a bug that the 1 GB error occurs even though the files haven't reached 1 GB. Going forward, it's probably better not to put large files in a Space's git repo at all.
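The "load it from there at startup" approach above can be sketched with `snapshot_download`, which fetches and caches an entire Hub repo. The dataset repo id here is a hypothetical placeholder:

```python
from huggingface_hub import snapshot_download

def fetch_assets(repo_id="your-name/your-assets"):
    # Downloads (and caches) the whole dataset repo and returns the local path.
    # On Spaces this runs at startup, so large files never live in the
    # Space's own git history and never count against its repo size.
    return snapshot_download(repo_id=repo_id, repo_type="dataset")

# Typically called once near the top of app.py:
# ASSET_DIR = fetch_assets()
```

Because downloads are cached, a Space restart that keeps its cache will not re-download unchanged files.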
Opened 06:33AM - 05 Dec 24 UTC · closed 02:34PM - 05 Dec 24 UTC · label: bug
### Describe the bug
from HF Forum:
https://discuss.huggingface.co/t/upload-large-folder-issue-with-uploading-to-spaces/129326
If you run `upload_large_folder()` on a non-existent Space, you get an error from `create_repo()`. This is because [`create_repo()`'s `space_sdk` is optional, but an error is raised if it is missing when `repo_type` is `space`](https://github.com/huggingface/huggingface_hub/blob/v0.26.3/src/huggingface_hub/hf_api.py#L3478).
https://github.com/huggingface/huggingface_hub/blob/v0.26.3/src/huggingface_hub/_upload_large_folder.py#L92
### Reproduction

```python
from huggingface_hub import HfApi

HF_TOKEN = "hf_*********"
api = HfApi(token=HF_TOKEN)
# "John6666/lftest" does not exist yet, so upload_large_folder() calls
# create_repo() internally, which raises because no space_sdk is given
api.upload_large_folder("John6666/lftest", folder_path="test_folder", repo_type="space", private=True)
```
### Logs
```shell
File "w:\TEMP\test\upload_large_folder_test.py", line 4, in <module>
api.upload_large_folder("John6666/lftest", folder_path="test_folder", repo_type="space", private=True)
File "c:\Users\User\AppData\Local\Programs\Python\Python39\lib\site-packages\huggingface_hub\hf_api.py", line 5473, in upload_large_folder
return upload_large_folder_internal(
File "c:\Users\User\AppData\Local\Programs\Python\Python39\lib\site-packages\huggingface_hub\_upload_large_folder.py", line 92, in upload_large_folder_internal
repo_url = api.create_repo(repo_id=repo_id, repo_type=repo_type, private=private, exist_ok=True)
File "c:\Users\User\AppData\Local\Programs\Python\Python39\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "c:\Users\User\AppData\Local\Programs\Python\Python39\lib\site-packages\huggingface_hub\hf_api.py", line 3479, in create_repo
raise ValueError(
ValueError: No space_sdk provided. `create_repo` expects space_sdk to be one of ['gradio', 'streamlit', 'docker', 'static'] when repo_type is 'space'.
```
### System info
```shell
- huggingface_hub version: 0.26.2
- Platform: Windows-10-10.0.19045-SP0
- Python version: 3.9.13
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Running in Google Colab Enterprise ?: No
- Token path ?: w:\hf\misc\token
- Has saved token ?: False
- Configured git credential helpers: manager
- FastAI: N/A
- Tensorflow: N/A
- Torch: 2.4.0+cu124
- Jinja2: 3.1.4
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: 11.0.0
- hf_transfer: N/A
- gradio: 4.44.1
- tensorboard: 2.6.2.2
- numpy: 1.23.5
- pydantic: 2.8.2
- aiohttp: 3.9.5
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: w:\hf\misc\hub
- HF_ASSETS_CACHE: w:\hf\misc\assets
- HF_TOKEN_PATH: w:\hf\misc\token
- HF_STORED_TOKENS_PATH: w:\hf\misc\stored_tokens
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
```
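Until the fix landed (the issue was closed the same day), a workaround was to create the Space explicitly with a `space_sdk` before calling `upload_large_folder()`, so the internal `create_repo(exist_ok=True)` call becomes a no-op. A hedged sketch; `upload_to_space` and the repo id are made-up names for illustration:

```python
from huggingface_hub import HfApi

# The SDKs create_repo accepts for Spaces, per the error message above.
VALID_SPACE_SDKS = ["gradio", "streamlit", "docker", "static"]

def upload_to_space(api, repo_id, folder_path, sdk="gradio", private=True):
    if sdk not in VALID_SPACE_SDKS:
        raise ValueError(f"sdk must be one of {VALID_SPACE_SDKS}")
    # Create the Space first, with an explicit space_sdk, so that
    # upload_large_folder's internal create_repo call finds it existing.
    api.create_repo(repo_id, repo_type="space", space_sdk=sdk,
                    private=private, exist_ok=True)
    api.upload_large_folder(repo_id, folder_path=folder_path, repo_type="space")

# Usage (placeholders):
# api = HfApi(token="hf_...")
# upload_to_space(api, "your-name/your-space", "test_folder")
```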