Is the Trainer supposed to be saving checkpoints from every process?

I’ve noticed that when using the HF Trainer, the Trainer object doesn’t seem to check for the main process via the Accelerator before saving a checkpoint. This was a little counterintuitive, because I thought the recommended pattern was to call something like accelerator.wait_for_everyone() and then write the checkpoint only from the main process.
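
For context, this is roughly the guard I expected to see around the save (a minimal sketch only; `model` and `checkpoint_dir` are placeholders for illustration, not names taken from trainer.py):

```python
import torch
from accelerate import Accelerator

accelerator = Accelerator()

# Stand-in model and output path, just for illustration.
model = torch.nn.Linear(4, 2)
model = accelerator.prepare(model)
checkpoint_dir = "checkpoint-example"

# Make sure every process has finished the current step before anyone saves.
accelerator.wait_for_everyone()

# Only the main process writes the checkpoint to disk.
if accelerator.is_main_process:
    unwrapped = accelerator.unwrap_model(model)
    torch.save(unwrapped.state_dict(), f"{checkpoint_dir}.pt")
```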

Is this intentional? Or am I misunderstanding something?

Thanks.

ref: transformers/src/transformers/trainer.py at commit 0fdea8607d7e01eb0e38a1ebeb7feee30a22f0cf (huggingface/transformers on GitHub)