Hugging Face Forums
Trainer + Datasets + Pytorch Dataloader Workers - how to manage memory usage?
🤗Transformers
John6666
April 29, 2025, 10:59am
I found a useful post about PyTorch’s DataLoader.
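For context on why DataLoader workers matter for memory: each worker is a separate process holding its own copy of the dataset object, and each worker prefetches batches ahead of the training loop. A minimal sketch of the knobs involved, using an illustrative toy dataset (the dataset class and all parameter values here are assumptions for demonstration, not from the linked post):

```python
# Sketch: DataLoader worker settings that drive memory use.
# With num_workers > 0, each worker process holds its own copy of the
# dataset object, and each keeps up to prefetch_factor batches buffered,
# so peak RAM grows roughly with num_workers * prefetch_factor * batch size.
import torch
from torch.utils.data import DataLoader, Dataset

class SquaresDataset(Dataset):
    """Toy map-style dataset; in practice the dataset is where worker RAM goes."""
    def __len__(self):
        return 64
    def __getitem__(self, idx):
        return torch.tensor([idx, idx * idx], dtype=torch.float32)

loader = DataLoader(
    SquaresDataset(),
    batch_size=8,
    num_workers=2,            # 2 worker processes; 0 loads in the main process
    prefetch_factor=2,        # each worker keeps up to 2 batches in flight
    persistent_workers=True,  # reuse workers across epochs (their RAM stays allocated)
    pin_memory=False,         # pinned host memory speeds GPU copies at a RAM cost
)

batches = list(loader)
print(len(batches), batches[0].shape)  # 8 batches, each of shape (8, 2)
```

With the Hugging Face `Trainer`, the corresponding knobs live on `TrainingArguments`: `dataloader_num_workers`, `dataloader_pin_memory`, and (in recent versions) `dataloader_persistent_workers` and `dataloader_prefetch_factor`.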
Related topics:
- A streaming dataset's memory footprint continually grows (🤗Datasets): 8 replies, 124 views, June 19, 2025
- Does Trainer use multiple workers on datasets? (🤗Transformers): 0 replies, 532 views, July 13, 2023
- Prevent iterable dataset from consuming all the RAM (Beginners): 2 replies, 26 views, June 24, 2025
- RAM memory issues while training with torch.distributed.launch (🤗Transformers): 1 reply, 1031 views, October 19, 2022
- Training with IterableDataset is very slow when using a large number of workers (🤗Transformers): 0 replies, 1303 views, August 19, 2023