How do I get around rate limits?

The upload works with:

huggingface-cli upload mysocratesnote/jfk-files-text ~/Desktop/extracted_text/releases --repo-type=dataset

But the CLI is recommending I do it another way:

Consider using hf_transfer for faster uploads. This solution comes with some limitations. See Environment variables for more details.

It seems you are trying to upload a large folder at once. This might take some time and then fail if the folder is too large. For such cases, it is recommended to upload in smaller batches or to use HfApi().upload_large_folder(...)/huggingface-cli upload-large-folder instead. For more details, check out Upload files to the Hub.

Start hashing 73480 files.

Finished hashing 73480 files.
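For reference, the approach the CLI is suggesting would look something like this (a sketch based on the warning text above, reusing the repo and folder from my original command; I haven't confirmed this avoids the rate limits):

```shell
# Optional: enable hf_transfer for faster uploads
# (requires: pip install hf_transfer)
export HF_HUB_ENABLE_HF_TRANSFER=1

# Resumable, batched upload of a large folder,
# as recommended for folders with many files
huggingface-cli upload-large-folder mysocratesnote/jfk-files-text \
  ~/Desktop/extracted_text/releases --repo-type=dataset
```

`upload-large-folder` splits the work into smaller commits and can resume after interruptions, which is presumably why the CLI suggests it for a 73,480-file folder.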
