Unable to free up storage

With the upcoming changes to private storage billing, I’m going through and trying to free up space from older no-longer-needed revisions of my datasets.

The docs recommend using super squash to free up the LFS objects.

But super squash has had no effect on the storage reported in the settings for my datasets.

My current suspicion is that the refs/convert/parquet branch is holding references to those LFS objects even though I’ve squashed the history on main. From what I’ve read, the parquet bot doesn’t actually do any conversion if the dataset is already in parquet; it just takes a reference to the existing files instead.

Unfortunately, it doesn’t look like super squash works on refs/convert/parquet (or refs/convert/duckdb for that matter).

Can someone from the Hugging Face team confirm whether my suspicions are correct, and what options I have for cleaning up these LFS files that don’t involve manually deleting each LFS file in the UI?
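For what it’s worth, one way to check whether the convert refs are still pinning commits is the Hub’s public refs endpoint (a minimal stdlib-only sketch; `refs_url` and `list_convert_refs` are my own helper names, and the repo id is a placeholder):

```python
import json
import urllib.request

def refs_url(repo_id: str, repo_type: str = "dataset") -> str:
    # The Hub lists all refs for a repo, including refs/convert/*, here.
    return f"https://huggingface.co/api/{repo_type}s/{repo_id}/refs"

def list_convert_refs(repo_id: str) -> list:
    # Network call: each entry should include the ref name
    # (e.g. "refs/convert/parquet") and the commit it still points at,
    # i.e. the commit that may be holding the old LFS objects alive.
    with urllib.request.urlopen(refs_url(repo_id)) as resp:
        return json.load(resp).get("converts", [])
```

If this returns a non-empty list after squashing main, that would be consistent with the convert branch keeping the old LFS objects referenced.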


@meganariley @lunarflu I’m pinging you because this is related to payment.

@zachoverflow could you please tell me how you performed the squash? I am trying to do it with this repo

using

api.super_squash_history(repo_id=repo_id, token=token)

but I am getting this error:

HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/api/models/valory/trader_agents_performance/super-squash/main

Because it is looking for a model…
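If it helps: `super_squash_history` defaults to `repo_type="model"`, which matches the `/api/models/...` URL in your 404. Passing `repo_type="dataset"` should make the client target the dataset namespace instead (a sketch, not tested against your repo; the commented-out call requires a token with write access):

```python
def squash_endpoint(repo_id: str, repo_type: str = "model", branch: str = "main") -> str:
    # The URL pattern visible in the 404: with no repo_type, the client
    # assumes "model" and hits /api/models/... instead of /api/datasets/...
    prefix = {"model": "models", "dataset": "datasets", "space": "spaces"}[repo_type]
    return f"https://huggingface.co/api/{prefix}/{repo_id}/super-squash/{branch}"

# The client-side fix (requires a write token):
# from huggingface_hub import HfApi
# HfApi().super_squash_history(
#     repo_id="valory/trader_agents_performance",
#     repo_type="dataset",  # without this, the 404 above is expected
#     token=token,
# )
```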

Thanks,

Rosa


Besides, I tried to delete the commits manually from here

and got this error

So it is impossible to delete anything. By the way, we have an Enterprise account.

Best

Rosa
