Use set_epoch with Accelerate?

When we are not using Accelerate for DDP, we need to call set_epoch on the sampler at the start of each epoch so the shuffling order changes between epochs:

from torch.utils.data import DataLoader
from torch.utils.data.distributed import DistributedSampler

sampler = DistributedSampler(dataset) if is_distributed else None
loader = DataLoader(dataset, shuffle=(sampler is None), sampler=sampler)

for epoch in range(start_epoch, n_epochs):
    if is_distributed:
        # Reshuffle so each epoch uses a different ordering across ranks
        sampler.set_epoch(epoch)
    train(loader)

Do we still need to call set_epoch when using Accelerate, or does it take care of this automatically?
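
For context, here is roughly what the same loop looks like with Accelerate (a minimal sketch; dataset, model, optimizer, train, start_epoch, and n_epochs are the same placeholders as in the snippet above):

from accelerate import Accelerator
from torch.utils.data import DataLoader

accelerator = Accelerator()
loader = DataLoader(dataset, shuffle=True)
# prepare() wraps the DataLoader for distributed training,
# so no DistributedSampler is created by hand here
model, optimizer, loader = accelerator.prepare(model, optimizer, loader)

for epoch in range(start_epoch, n_epochs):
    # Unclear: does the prepared loader reshuffle per epoch on its own,
    # or is an explicit call like loader.set_epoch(epoch) still required?
    train(loader)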