Error: Repository storage limit reached (Max: 1 GB)

There isn’t a way to increase the 1 GB storage limit for a Space repo.

You can free up storage, though; more tips are here: How can I free up storage space in my account/organization?
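
If it helps, here is a minimal sketch of removing a large file from the Space repo programmatically with huggingface_hub (the repo ID and filename are placeholders). Note that deleting a file only adds a new commit; old LFS blobs may still need to be cleaned up from the repo's Settings page.

from huggingface_hub import HfApi

api = HfApi()

# Delete a large file from the Space repo to free working storage.
# "username/space-name" and "model.safetensors" are placeholders.
api.delete_file(
    path_in_repo="model.safetensors",
    repo_id="username/space-name",
    repo_type="space",
    commit_message="Remove large model file to free storage",
)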


Thanks @meganariley. I'm now migrating the model to the HF Hub. I noticed some of my models are single .safetensors files that are loaded with DiffusionPipeline.from_single_file("model.safetensors"). Would you recommend converting them to the standard structure ("model_index.json", "diffusion_pytorch_model.safetensors", etc.), or could I efficiently use

from huggingface_hub import hf_hub_download
from diffusers import DiffusionPipeline

path = hf_hub_download("username/model-repo", "model.safetensors")
pipe = DiffusionPipeline.from_single_file(path)

Thanks.


Once loaded into the pipeline, a model loaded via from_single_file is in the same state as one loaded via from_pretrained, so either is fine. However, since from_single_file performs the conversion on the fly, from_pretrained is slightly faster to load.

If you want to save the model in the converted Diffusers format, just do the following:

from huggingface_hub import hf_hub_download
from diffusers import DiffusionPipeline

path_diffusers = "./model_diffusers"
path = hf_hub_download("username/model-repo", "model.safetensors")
pipe = DiffusionPipeline.from_single_file(path)
pipe.save_pretrained(path_diffusers)  # save in the converted Diffusers layout
# new_pipe = DiffusionPipeline.from_pretrained(path_diffusers)  # to load it back later
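
If you want the converted copy on the Hub rather than only on disk, the pipeline's push_to_hub method should also work; a minimal sketch (the target repo ID is a placeholder):

# Push the converted pipeline to a new Hub repo
# ("username/model-repo-diffusers" is a placeholder repo ID).
pipe.push_to_hub("username/model-repo-diffusers")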

Nice tip. Thanks!


I’ve created a sample code/Space for the transition.
