Get "using the `__call__` method is faster" warning with DataCollatorWithPadding

I’m seeing the same thing with a BertTokenizerFast and DataCollatorWithPadding: the warning fires once per worker every time I iterate over a DataLoader. I’d prefer not to silence warnings globally in my training code, but here’s how I’m getting around it (based on this line in the `PreTrainedTokenizerBase` class, which goes through this section of the custom logger):

```python
import os

# Suppresses only Transformers "advisory" warnings (the ones emitted via
# logger.warning_advice); regular warnings and errors still come through
os.environ['TRANSFORMERS_NO_ADVISORY_WARNINGS'] = 'true'
```
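If you’d rather avoid triggering the warning at all instead of suppressing it, one option is to skip DataCollatorWithPadding and pad each batch through the tokenizer’s `__call__` in a custom collate function, which is what the warning is suggesting. A minimal sketch, assuming your dataset yields raw `{"text", "label"}` dicts (the names and model checkpoint here are illustrative, not from the original post):

```python
import torch
from torch.utils.data import DataLoader
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Tiny stand-in dataset so the sketch runs end to end
dataset = [
    {"text": "hello world", "label": 0},
    {"text": "a slightly longer example", "label": 1},
]

def collate_fn(batch):
    # Tokenize and pad the whole batch via __call__, so pad() is never used
    enc = tokenizer(
        [ex["text"] for ex in batch],
        padding=True,
        truncation=True,
        return_tensors="pt",
    )
    enc["labels"] = torch.tensor([ex["label"] for ex in batch])
    return enc

loader = DataLoader(dataset, batch_size=2, collate_fn=collate_fn)
```

The trade-off is that tokenization moves from preprocessing into the DataLoader workers, so it runs on every epoch; for most setups the fast tokenizer makes that negligible.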