Tricks to control/suppress logging output

I think since the logger PR, I have started getting much more logging output.
What is a reasonable level for a training script? Is ERROR too aggressive? @lysandre?

```python
from transformers.utils.logging import ERROR, get_logger

logger = get_logger('seq2seq/finetune')
logger.setLevel(ERROR)
```
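If the goal is simply to quiet everything the library emits, one option is a sketch using only the standard `logging` module: the library's loggers all live under the `"transformers"` name, so raising the level on that one logger is inherited by every child (the child logger name below is illustrative).

```python
import logging

# Raise the threshold on the library's root logger; every
# "transformers.*" child logger inherits it, so only ERROR
# and above gets emitted.
logging.getLogger("transformers").setLevel(logging.ERROR)

# Sanity check: a child logger's effective level is now ERROR too.
child = logging.getLogger("transformers.tokenization_utils")
print(child.getEffectiveLevel() == logging.ERROR)  # True
```

If your version of the library exposes it, `transformers.logging.set_verbosity_error()` should have the same effect through the official API.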

Example of the logging output:

```
Model name 'google/pegasus-xsum' not found in model shortcut name list (google/reformer-crime-and-punishment). Assuming 'google/pegasus-xsum' is a path, a model identifier, or url to a directory containing tokenizer files.
loading file https://s3.amazonaws.com/models.huggingface.co/bert/google/pegasus-xsum/spiece.model from cache at /home/shleifer/.cache/torch/transformers/3848e191fec9935f1369d13fe14c5d785a9ff32b812559047361f135a53aaf13.efce77b8dcd2c57b109b0d10170fcdcd53f23c21286974d4f66706536758ab6e
loading file https://s3.amazonaws.com/models.huggingface.co/bert/google/pegasus-xsum/added_tokens.json from cache at None
loading file https://s3.amazonaws.com/models.huggingface.co/bert/google/pegasus-xsum/special_tokens_map.json from cache at /home/shleifer/.cache/torch/transformers/bc045ff6a50eaed77dd55c18cb2baf14d21b5288b0fb06615ade9cb08983bd38.d142dfa55f201f5033fe9ee40eb8fe1ca965dcb0f38b175386020492986d507f
loading file https://s3.amazonaws.com/models.huggingface.co/bert/google/pegasus-xsum/tokenizer_config.json from cache at /home/shleifer/.cache/torch/transformers/1cc20bfd3eea66838b0cfe76e83bfd96b26ef1ab1ba0e8119f4850fa781a5710.c7d99a644bcf983aa23a0081b95cf8be1bc67c66b6afc27a266363ca1d8e463e
loading file https://s3.amazonaws.com/models.huggingface.co/bert/google/pegasus-xsum/tokenizer.json from cache at None
loading weights file https://cdn.huggingface.co/google/pegasus-large/pytorch_model.bin from cache at /home/shleifer/.cache/torch/transformers/e95d8a42105877a796b5f878b2ccebfb3ed36c1ae87be48d0a2656afff150431.b6d2710a7db33f51964e675a76e102fa14b361b2471d45ef76b544bcb730ef46
All model checkpoint weights were used when initializing PegasusForConditionalGeneration.
```

Related: @stas's solution to https://github.com/huggingface/transformers/issues/3050


Direct link to the solution: https://github.com/huggingface/transformers/issues/3050#issuecomment-682167272
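The gist of that approach can be sketched with just the standard `logging` module: target specific noisy submodules by name and quiet only those, leaving the rest of the library at its default verbosity (the module names below are illustrative, not an exhaustive list of the noisy ones).

```python
import logging

# Silence a few specific submodules while the rest of the
# library keeps its default verbosity (names are illustrative).
NOISY_MODULES = [
    "transformers.tokenization_utils",
    "transformers.configuration_utils",
]
for name in NOISY_MODULES:
    logging.getLogger(name).setLevel(logging.ERROR)
```

This is finer-grained than setting a single level on the `"transformers"` root logger, at the cost of keeping the list of module names up to date.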


Also, it’d be really lovely to be able to disable the noise during testing, while keeping normal stdout/stderr channels open.

Something like this:

```
pytest --loglevel=error -sv tests/test_i_debug.py
```

edit: done: https://github.com/huggingface/transformers/pull/6816
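For anyone who can't use that PR, pytest's built-in log capture already offers something close via `--log-cli-level=ERROR` (check your pytest version supports it), and the same effect can be forced from a `conftest.py` hook. A minimal sketch, assuming the loggers you want to quiet live under the `"transformers"` name:

```python
# conftest.py -- quiet the library's loggers for the whole test session.
# This uses pytest's standard pytest_configure hook; the logger name
# "transformers" is an assumption about what you want silenced.
import logging


def pytest_configure(config):
    logging.getLogger("transformers").setLevel(logging.ERROR)
```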