I get the following warning when training a BERT model using pytorch-pretrained-bert 0.6.2 (transformers==3.1.0 is also installed).
in pytorch_pretrained_bert.optimization._LRSchedule.get_lr()
"Training beyond specified 't_total'. Learning rate multiplier set to 0.0. Please set 't_total' of WarmupLinearSchedule correctly."
Model: bert-base-uncased
The issue is marked as closed in the GitHub repo: https://github.com/huggingface/transformers/issues/556
What can I do to address this warning? It is not straightforward to pass a nowarn parameter, and it is not clear where I can set the t_total parameter.
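From what I understand, in pytorch-pretrained-bert the schedule is built inside BertAdam, so t_total is set by passing the total number of optimizer steps to the BertAdam constructor. A minimal sketch of computing that value (the helper name and the example hyperparameters are my own, not from the library):

```python
import math

def compute_t_total(num_examples, batch_size, gradient_accumulation_steps, num_epochs):
    """Total number of optimizer steps over the whole training run:
    one step per (gradient-accumulated) batch, per epoch."""
    steps_per_epoch = math.ceil(num_examples / batch_size) // gradient_accumulation_steps
    return steps_per_epoch * num_epochs

t_total = compute_t_total(num_examples=1000, batch_size=32,
                          gradient_accumulation_steps=1, num_epochs=3)

# Then pass it to the optimizer (assumes pytorch-pretrained-bert 0.6.2):
# from pytorch_pretrained_bert.optimization import BertAdam
# optimizer = BertAdam(optimizer_grouped_parameters,
#                      lr=2e-5,
#                      warmup=0.1,       # fraction of t_total used for warmup
#                      t_total=t_total)  # must cover ALL steps, or the warning fires
```

The warning appears when the global step exceeds t_total, so any mismatch (for example, forgetting gradient accumulation or an extra epoch) will trigger it.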