What is my batch size?

Hello, I’m new to multi-GPU training.
Using Accelerate has been really convenient. Thanks a lot!
But I’m unsure about my batch size, because when I use Accelerate with the following DataLoader call,

train_dataloader = torch.utils.data.DataLoader(train_dataset, batch_size=3, …)

each batch seems to get copied to the other GPUs.
So is my actual batch size 12 (I’m running on 4 GPUs), or still 3?
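
For context, here’s a minimal sketch of my setup. The TensorDataset and Linear model are just placeholders standing in for my real data and model:

```python
import torch
from accelerate import Accelerator

accelerator = Accelerator()

# Placeholder dataset standing in for my real train_dataset
train_dataset = torch.utils.data.TensorDataset(torch.randn(120, 10))

# Per-device batch size of 3, as in my actual code
train_dataloader = torch.utils.data.DataLoader(train_dataset, batch_size=3, shuffle=True)

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# accelerator.prepare wraps everything for multi-GPU training
model, optimizer, train_dataloader = accelerator.prepare(model, optimizer, train_dataloader)

for (batch,) in train_dataloader:
    # On each process this prints a batch of 3 -- is the total 3 or 3 * num_gpus?
    print(accelerator.process_index, batch.shape)
    break
```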