Hi
I cloned a working Space, but when I try to modify the code and commit to main, I receive this error. My overall storage is fine, with just 59 GB used. Also, even a factory reset or restart of the Space doesn’t work; it gets stuck in an endless building process with no output in the log file. Any ideas?
Thanks for reading
I am having exactly the same issue.
I tried with the hub API as well, but got the same result.
Any fix for this?
The build stack is probably the same as this one. Also, a free CPU Space was probably 2 cores, 16 GB RAM, 0 GB VRAM, and 50 GB SSD…
I have the same issue too!
Today I tried to edit my Space code, but I cannot commit new updates due to the storage limit.
Any solution for this? I am using ZeroGPU.
It’s definitely a bug. The storage capacity of a ZeroGPU Space is so large that it’s practically impossible to use it all, so the error message is very likely wrong.
Experiencing this too; yesterday I was able to commit changes to main.
I just reported it on the HF Discord.
I wonder if the enforcement of the storage limit that suddenly appeared today is buggy.
Could someone from Hugging Face support please fix this issue as soon as possible? It is blocking some important updates we need to make to our Space.
Same problem here.
Same thing here. I assume it’s a bug with the storage limits change rolled out yesterday, even though my repo is under 1 GB.
This has been fixed for me now. Thanks, Hugging Face!
I’m having the same issue. How did you get yours fixed?
Also still having the same issue, even after buying the Pro subscription.
Still facing the same issue. What can you even do with 1 GB?
We can put data in a public model repo or dataset repo for free, even if it’s several terabytes, so I think they want us to load it from there when the Spaces program starts.
Downloading to Spaces is very fast.
However, there are some headaches when bringing in a GitHub sample as-is.
In any case, it seems like a bug that a 1 GB error occurs even though the repo has not reached 1 GB. Going forward, it seems best not to put large files in a Space’s git repo.
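The workaround described above (keep large files in a dataset repo and pull them when the Space boots) can be sketched with `huggingface_hub`. This is just an illustration; the repo ID and filename below are hypothetical placeholders:

```python
from huggingface_hub import hf_hub_download

def fetch_asset(repo_id: str, filename: str) -> str:
    """Download one file from a dataset repo into the local HF cache
    and return its local path; call this once at Space startup instead
    of committing the file into the Space's own git repo."""
    return hf_hub_download(repo_id=repo_id, filename=filename, repo_type="dataset")

# Hypothetical usage at the top of app.py (repo and file names are made up):
# weights_path = fetch_asset("your-username/your-dataset", "model_weights.bin")
```

Since the file lands in the shared HF cache rather than the git working tree, commits to the Space stay small.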
Another thing to take into account is that committed files also use storage. One way to free some space is to go to the Settings menu, then “Storage usage”, and click “List LFS files”. There you get a list of all the LFS files in your repo sorted by size, so you can delete old ones if you no longer need them and can afford to lose them.
Hope this helps
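The settings-page trick above can also be scripted. This is a rough sketch: it assumes a recent `huggingface_hub` version that provides `HfApi.list_lfs_files`, and the repo ID in the usage comment is hypothetical:

```python
from huggingface_hub import HfApi

def largest_first(lfs_files, top_n=10):
    """Sort LFS file entries by size, largest first, so you can see
    what is eating the repo's storage quota."""
    return sorted(lfs_files, key=lambda f: f.size, reverse=True)[:top_n]

def list_space_lfs(repo_id: str, top_n: int = 10):
    """Fetch the LFS files of a Space repo (requires a token with
    access to the repo) and return the largest ones."""
    api = HfApi()
    return largest_first(api.list_lfs_files(repo_id=repo_id, repo_type="space"), top_n)

# Hypothetical usage:
# for f in list_space_lfs("your-username/your-space"):
#     print(f.filename, f.size)
```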
Same problem here. I cannot commit files to my Space. Did anyone find a fix?
Hi @DevangMahesh Thanks for posting! Spaces are typically used to showcase projects or demos and should only contain code. We do not recommend storing data in a Space repo. Data files can be loaded from other hub repo types (models, datasets) easily into your Gradio app. For example, the hub Python client library allows for a straightforward process to load datasets into your Gradio Space. More information can be found here: 🤗 Hub client library.
There are several benefits when it comes to storing data files in a dataset, such as:
- You have the ability to version your app and data independently
- Your Space will rebuild more quickly
- You can work with the data more efficiently in a dataset, as they are optimized for data
Another option is creating a separate repo alongside your Space repository (Create and manage a repository) and using Inference Endpoints instead of Spaces.
We also have some tips and recommendations for freeing up storage in your Space: Storage limits.
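As a minimal sketch of the pattern suggested in the answer above, a Gradio Space can pull its data from a dataset repo at startup with the hub client library (the repo ID here is a hypothetical placeholder):

```python
from huggingface_hub import snapshot_download

def fetch_data_dir(repo_id: str) -> str:
    """Download an entire dataset repo into the local cache and return
    the directory path; run once when the Gradio app starts so the data
    never has to live in the Space's git history."""
    return snapshot_download(repo_id=repo_id, repo_type="dataset")

# Hypothetical usage in app.py:
# data_dir = fetch_data_dir("your-username/your-demo-data")
```

This keeps the app code and the data versioned independently, which is one of the benefits listed above.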
Thanks @meganariley. Now I understand that Spaces put a 1 GB limit on new/updated repos. Is there any way to purchase storage to increase this limit? I saw there are options to buy persistent storage (20 GB, 150 GB, …), but even after purchasing, the 1 GB limitation still seems to apply when I use git push, so I’m confused. A bit of context: I’d love to migrate to the approach you recommended, but this is a demo project for a submitted paper, so I’m unable to restructure the repo much right now.