Hugging Face Forums
Which data parallel does trainer use? DP or DDP?
🤗Transformers
brando
August 17, 2022, 3:03pm
Perhaps useful to you:
Using Transformers with DistributedDataParallel — any examples?
How to run an end to end example of distributed data parallel with hugging face's trainer api (ideally on a single node multiple gpus)?
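To the question in the title: Trainer uses plain `torch.nn.DataParallel` (DP) when a single process sees multiple GPUs, and `DistributedDataParallel` (DDP) when the script is launched with a distributed launcher such as `torchrun` or `python -m torch.distributed.launch`, which sets a per-process `local_rank`. A minimal sketch of that decision logic (a simplified illustration with a hypothetical function name, not the actual `transformers` source):

```python
def pick_parallel_mode(n_gpu: int, local_rank: int) -> str:
    """Sketch of how Trainer chooses its data-parallel strategy.

    local_rank == -1 means no distributed launcher was used; a launcher
    like torchrun spawns one process per GPU and sets local_rank >= 0
    in each process.
    """
    if local_rank != -1:
        return "ddp"     # one process per GPU -> DistributedDataParallel
    if n_gpu > 1:
        return "dp"      # one process, many GPUs -> torch.nn.DataParallel
    return "single"      # one process, at most one GPU -> no wrapping

# Launched via `torchrun --nproc_per_node=4 train.py`:
print(pick_parallel_mode(n_gpu=1, local_rank=0))   # ddp
# Plain `python train.py` on a 4-GPU machine:
print(pick_parallel_mode(n_gpu=4, local_rank=-1))  # dp
```

So to get DDP (generally recommended over DP), launch your unchanged Trainer script with `torchrun --nproc_per_node=NUM_GPUS your_script.py` rather than plain `python your_script.py`.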