Trainer + Datasets + PyTorch DataLoader workers: how to manage memory usage?

I found a useful post about PyTorch’s DataLoader.
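Since the question is about worker memory, here is a minimal sketch of the settings usually involved. Assumptions: PyTorch and NumPy are installed, and the `ArrayDataset` class is purely illustrative (not from the original post). A commonly discussed pattern is to keep the dataset's samples in one contiguous NumPy array rather than a list of Python objects, so that forked worker processes can share the pages instead of gradually copying them.

```python
import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

class ArrayDataset(Dataset):
    # Illustrative dataset: samples live in a single NumPy array.
    # Lists of Python objects are touched by refcounting in every
    # worker, which defeats copy-on-write sharing after fork;
    # a flat array avoids those writes.
    def __init__(self, n=1000, dim=8):
        self.data = np.random.rand(n, dim).astype(np.float32)

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        return torch.from_numpy(self.data[idx])

loader = DataLoader(
    ArrayDataset(),
    batch_size=32,
    num_workers=2,            # each worker is a separate process with its own memory
    persistent_workers=True,  # reuse workers across epochs instead of respawning them
    pin_memory=True,          # page-locked host memory for faster GPU transfer
)
```

With the Hugging Face `Trainer`, the analogous knob is `TrainingArguments(dataloader_num_workers=...)`; the per-worker memory cost scales with whatever the dataset object holds in RAM.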