Training beyond specified 't_total'. Learning rate multiplier set to 0.0. Please set 't_total' of WarmupLinearSchedule correctly

I get the following warning when training a BERT model using pytorch-pretrained-bert==0.6.2 and transformers==3.1.0.

The warning is raised in pytorch_pretrained_bert.optimization._LRSchedule.get_lr():

"Training beyond specified 't_total'. Learning rate multiplier set to 0.0. Please set 't_total' of WarmupLinearSchedule correctly."

Model: bert-base-uncased

A matching issue is marked as closed in the GitHub repo: https://github.com/huggingface/transformers/issues/556

What can I do to address this warning? It doesn't seem straightforward to pass a no-warn parameter, and it's not clear where I can set the t_total parameter.
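
From the linked issue, my understanding is that t_total is supposed to be passed when constructing the BertAdam optimizer, something like the sketch below. The step counts and hyperparameter values are placeholders from my setup, not values taken from the docs:

```python
from pytorch_pretrained_bert.modeling import BertForSequenceClassification
from pytorch_pretrained_bert.optimization import BertAdam

model = BertForSequenceClassification.from_pretrained('bert-base-uncased', num_labels=2)

# Placeholders for my actual training loop; adjust to your data.
num_epochs = 3
batches_per_epoch = 1000          # i.e. len(train_dataloader)
gradient_accumulation_steps = 1

# t_total should cover every optimizer.step() call you plan to take;
# once the step counter exceeds it, the LR multiplier is clamped to 0.0
# and this warning fires.
num_train_steps = batches_per_epoch // gradient_accumulation_steps * num_epochs

optimizer = BertAdam(
    model.parameters(),
    lr=2e-5,
    warmup=0.1,                   # fraction of t_total used for LR warmup
    t_total=num_train_steps,
)
```

Is this the right place to set it, or is there some other way to configure the schedule (or suppress the warning)?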