Using hyperparameter-search in Trainer

I’m having some issues with this under the optuna backend. Here is my hyperparameter space:

def hyperparameter_space(trial):
    # All continuous ranges are sampled log-uniformly.
    return {
        "learning_rate": trial.suggest_float("learning_rate", 1e-6, 1e-4, log=True),
        "per_device_train_batch_size": trial.suggest_categorical("per_device_train_batch_size", [8, 16, 32]),
        "weight_decay": trial.suggest_float("weight_decay", 1e-12, 1e-1, log=True),
        "adam_epsilon": trial.suggest_float("adam_epsilon", 1e-10, 1e-6, log=True),
    }

When I call trainer.hyperparameter_search with this space, I find that it also varies the number of epochs, even though num_train_epochs is fixed at 5 in my TrainingArguments. The run that’s going now has completed several 5-epoch trials, but it’s currently running a 20-epoch trial. Has anyone observed anything like this?
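
For context, here is a simplified sketch of how I’m invoking the search (the model name, datasets, and n_trials are placeholders, not my exact setup):

from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

# num_train_epochs is fixed to 5 here, yet some trials run for more epochs.
training_args = TrainingArguments(
    output_dir="./hp_search",
    num_train_epochs=5,
    evaluation_strategy="epoch",
)

def model_init():
    # A fresh model for every trial, as hyperparameter_search requires.
    return AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

trainer = Trainer(
    model_init=model_init,
    args=training_args,
    train_dataset=train_dataset,  # assumed defined elsewhere
    eval_dataset=eval_dataset,    # assumed defined elsewhere
)

best_run = trainer.hyperparameter_search(
    hp_space=hyperparameter_space,
    backend="optuna",
    direction="minimize",
    n_trials=20,
)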

Thank you very much.