Hello, when I tried to upload a dataset with over 10,000 files (around 600 GB in total) to my Hugging Face repository, I ran into the following problems:
- The upload speed was quite slow (around 2.7 MB/s). How can I improve it?
- The upload was frequently interrupted by the following EOF error:

```
Uploading LFS objects: 0% (1/10000), 1.2 GB | 2.7 MB/s
write |1: broken pipe
write |1: broken pipe
EOF
EOF
error: failed to push some refs to 'https://<my_username>:<my_token>@huggingface.co/datasets/<my_username>/<my_dataset_name>
```
- My shell commands were:

```
cd <my_dataset_directory>
git add .
git commit -m "some message"
git push
```
By the way, I have already enabled LFS for large-file uploads with the command `huggingface-cli lfs-enable-largefiles <my_dataset_directory>`.
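One workaround I am considering (just a sketch, I have not run it yet) is to stage, commit, and push the files in smaller batches, so that each `git push` transfers less data and an interrupted push loses less progress. The file names and batch size below are placeholders, not my real data:

```python
# Hypothetical sketch: split the file list into batches so each push moves a
# manageable amount of LFS data. In a real run, each batch would be followed
# by `git add <files>; git commit; git push` (and a failed push can simply be
# retried, since git-lfs skips objects that were already uploaded).
from itertools import islice

def batched(iterable, size):
    """Yield lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk

# Stand-in for the real 10,000+ files in the dataset directory.
files = [f"shard_{i:05d}.bin" for i in range(10)]

for batch in batched(files, 4):
    # Placeholder for: subprocess.run(["git", "add", *batch]), commit, push.
    print(f"would push a batch of {len(batch)} files")
```

Would something like this help with the broken-pipe errors, or is there a better-supported way to upload at this scale?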
I really hope someone can give me some ideas about this! Thank you~