T5 Fine-Tuning for summarization with multiple GPUs

Hi guys,
I hope you all are fine.

I am happy to be a part of this awesome community.

I am trying to fine-tune the T5 model for summarization with multiple GPUs. I was following the summarization script from Chapter 7 of the Hugging Face Transformers course (the link is here), but I did not find an n_gpu argument in the training arguments.

How can I fine-tune the T5 for summarization using multiple GPUs?
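In case it helps frame the question: as far as I understand, n_gpu is not something you set by hand; the Trainer detects all visible GPUs, and for distributed training you control the GPU count through the launcher. A sketch of what I think the launch commands would look like, assuming the course script is saved locally as train_summarization.py (hypothetical filename):

```shell
# n_gpu is derived by the Trainer from the visible devices, not passed as an argument.

# Option 1: torchrun starts one process per GPU (DistributedDataParallel).
torchrun --nproc_per_node=2 train_summarization.py

# Option 2: Accelerate, which the course's custom training loop uses.
accelerate config   # answer the prompts once (multi-GPU, number of processes)
accelerate launch train_summarization.py

# Restrict which GPUs are visible, if needed:
CUDA_VISIBLE_DEVICES=0,1 torchrun --nproc_per_node=2 train_summarization.py
```

Is this the right approach, or is there a way to configure it directly in the script?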

Thank you.