Hugging Face Forums
Which data parallel does Trainer use? DP or DDP?
🤗Transformers
brando
August 17, 2022, 3:03pm
Perhaps useful to you:
Using Transformers with DistributedDataParallel — any examples?
How to run an end-to-end example of distributed data parallel with Hugging Face's Trainer API (ideally on a single node with multiple GPUs)?
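As a rough rule of thumb: when a script is launched with `torchrun` (or `torch.distributed.launch`), Trainer runs in DistributedDataParallel (DDP) mode; a plain `python train.py` launch on a multi-GPU machine falls back to `nn.DataParallel` (DP). A minimal sketch of that decision logic, with a hypothetical helper name (this mirrors the behavior, it is not Trainer's actual source):

```python
import os

def data_parallel_mode(n_gpus: int) -> str:
    """Hypothetical helper mirroring how the launch method selects
    the parallelism strategy: torchrun sets LOCAL_RANK in the
    environment, so its presence implies DDP; otherwise a
    multi-GPU run falls back to nn.DataParallel (DP)."""
    if "LOCAL_RANK" in os.environ:
        return "DDP"      # launched via torchrun / torch.distributed
    if n_gpus > 1:
        return "DP"       # plain `python train.py` on a multi-GPU node
    return "single"       # one GPU (or CPU), no data parallelism

# Example: without torchrun, two visible GPUs would mean DP.
os.environ.pop("LOCAL_RANK", None)
print(data_parallel_mode(2))  # DP
```

So to get DDP on a single node with, say, 4 GPUs, you would launch the same script with `torchrun --nproc_per_node=4 train.py` instead of `python train.py`.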