run_mlm_wwm.py learning_rate confusion

I want to run (or resume) the run_mlm.py script with a higher learning rate, but setting --learning_rate in the script arguments doesn't seem to have any effect.

import os

# script, model, data, out_dir and overwrite are defined elsewhere in my launcher code.
os.system(
    f"python {script} \
        --model_type {model} \
        --config_name './models/{model}/config.json' \
        --train_file './content/{data}/train.txt' \
        --validation_file './content/{data}/test.txt' \
        --learning_rate 6e-4 \
        --weight_decay 0.01 \
        --warmup_steps 6000 \
        --adam_beta1 0.9 \
        --adam_beta2 0.98 \
        --adam_epsilon 1e-6 \
        --tokenizer_name './tokenizer/{model}' \
        --output_dir './{out_dir}' \
        --do_train \
        --do_eval \
        --num_train_epochs 40 \
        --overwrite_output_dir {overwrite} \
        --ignore_data_skip"
)
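
For reference, a minimal sketch of how one could sanity-check what value the argument parser actually receives for --learning_rate, assuming the example script uses the standard HfArgumentParser/TrainingArguments setup from transformers (the flag list below is a reduced, hypothetical subset of my real command):

from transformers import HfArgumentParser, TrainingArguments

# Parse a reduced subset of the flags in isolation to see what value
# ends up in TrainingArguments (assumes the script uses the standard
# HfArgumentParser / TrainingArguments dataclasses).
parser = HfArgumentParser(TrainingArguments)
(training_args,) = parser.parse_args_into_dataclasses(
    args=[
        "--output_dir", "./tmp_check",
        "--learning_rate", "6e-4",
        "--warmup_steps", "6000",
    ]
)
print(training_args.learning_rate)  # expected: 0.0006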

After warm-up, the log shows the learning rate topping out at 1e-05, which looks like a default coming from somewhere, but I can't tell where:

{'loss': 3.9821, 'learning_rate': 1e-05, 'epoch': 0.09}
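
For the resume case, a minimal sketch for checking what learning rate the checkpoint itself recorded, assuming a standard Trainer checkpoint directory (the path below is hypothetical):

import json

# Hypothetical checkpoint path; Trainer writes a trainer_state.json
# into each checkpoint-XXXX directory, including the logged learning rates.
state_path = "./out_dir/checkpoint-500/trainer_state.json"

with open(state_path) as f:
    state = json.load(f)

# Print the learning rate recorded at each logging step.
for entry in state["log_history"]:
    if "learning_rate" in entry:
        print(entry["step"], entry["learning_rate"])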