T5TokenizerFast runs into a deadlock when used with PyTorch's DataLoader and num_workers > 0

In a recent project, I found that simply using T5TokenizerFast inside my DataLoader causes the script to hang indefinitely due to a deadlock. T5Tokenizer does not have this problem. Is this a known issue?
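
For reference, here is a minimal sketch of the kind of setup that hangs for me. The `t5-small` checkpoint, the dummy texts, and the exact arguments are just placeholders; the key point is that the fast tokenizer is called inside `__getitem__`, i.e. in the forked worker processes:

```python
import torch
from torch.utils.data import Dataset, DataLoader
from transformers import T5TokenizerFast


class ToyDataset(Dataset):
    """Tokenizes raw strings inside __getitem__, i.e. in DataLoader workers."""

    def __init__(self, texts):
        self.texts = texts
        # Fast (Rust-backed) tokenizer; swapping in T5Tokenizer avoids the hang.
        self.tokenizer = T5TokenizerFast.from_pretrained("t5-small")

    def __len__(self):
        return len(self.texts)

    def __getitem__(self, idx):
        enc = self.tokenizer(
            self.texts[idx],
            padding="max_length",
            truncation=True,
            max_length=32,
            return_tensors="pt",
        )
        return enc["input_ids"].squeeze(0)


if __name__ == "__main__":
    ds = ToyDataset(["hello world"] * 64)
    # With num_workers=0 this runs fine; with num_workers > 0 it hangs for me.
    loader = DataLoader(ds, batch_size=8, num_workers=2)
    for batch in loader:
        print(batch.shape)
```

The hang goes away for me with `num_workers=0` or when I switch to the slow `T5Tokenizer`.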