LM example run_clm.py isn't distributing data across multiple GPUs as expected
brando
August 17, 2022, 3:03pm
Does this solve your question:
Using Transformers with DistributedDataParallel — any examples?
How to run an end to end example of distributed data parallel with hugging face's trainer api (ideally on a single node multiple gpus)?
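For reference, here is a minimal sketch of what that linked thread covers: a Trainer-based causal LM script that runs under DistributedDataParallel on a single node with multiple GPUs. When launched with `torchrun`, the Trainer detects the distributed environment and wraps the model in DDP itself; no manual DDP code is needed. The checkpoint and dataset below are illustrative placeholders, not from the original post.

```python
# Minimal sketch: Hugging Face Trainer + DDP on one node, multiple GPUs.
# Launch with:  torchrun --nproc_per_node=2 train_clm.py
# "gpt2" and the wikitext slice are placeholders for this example.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "gpt2"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

raw = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

dataset = raw.map(tokenize, batched=True, remove_columns=["text"])
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="clm-ddp-out",
    per_device_train_batch_size=4,  # per-GPU batch size under DDP
    num_train_epochs=1,
)

# Launched via torchrun, Trainer reads LOCAL_RANK/WORLD_SIZE from the
# environment, shards the data across ranks, and wraps the model in DDP.
Trainer(
    model=model,
    args=args,
    train_dataset=dataset,
    data_collator=collator,
).train()
```

Note that `per_device_train_batch_size` is per GPU, so the effective global batch size is that value times the number of processes.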