Does Trainer use multiple workers on datasets?

Hey,
when using the Trainer with a Hugging Face dataset, does it use multiple workers and prefetching when loading batches (as can be done with torch.DataLoader)?
This can significantly affect training time. Is there a way to prefetch data so that data loading won't be a bottleneck during training?
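
For reference, this is the kind of setup I mean with a plain torch.DataLoader (dummy dataset and illustrative values, just to show the knobs I'm asking about):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy dataset standing in for a Hugging Face dataset.
dataset = TensorDataset(torch.randn(1024, 16), torch.randint(0, 2, (1024,)))

loader = DataLoader(
    dataset,
    batch_size=32,
    shuffle=True,
    num_workers=4,      # batches are loaded/collated in background worker processes
    prefetch_factor=2,  # number of batches each worker prefetches ahead of the loop
    pin_memory=True,    # speeds up host-to-GPU copies
)

for features, labels in loader:
    ...  # training step runs while workers prepare the next batches
```

Is there an equivalent way to get this behavior when the Trainer builds its dataloaders internally?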

Thanks