Is the trainer's seed reset at every model_init?

Because seed is a TrainingArgument, we can use it in hyperparameter optimization if we want to, e.g.

def hparams_ray(trial):
    from ray import tune

    return {
        "learning_rate": tune.loguniform(1e-6, 1e-3),
        "per_device_train_batch_size": tune.choice([4, 8, 16, 32]),
         "seed": tune.choice(range(1, 43)),
    }

But I am looking for clarification on what happens when we do not include seed here. Is the seed (or any training argument, for that matter) reset at each trial? This is important: if it is not reset, then every trial starts from a different seed state and the hyperparameter search trials are not independent.
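
For context, this is roughly how I launch the search; my model_init, datasets, and checkpoint name below are just placeholders:

    from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

    def model_init():
        # Called at the start of every trial to get a fresh model; its weight
        # initialization depends on whatever seed is active at that moment.
        return AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

    trainer = Trainer(
        model_init=model_init,
        args=TrainingArguments(output_dir="out", seed=42),
        train_dataset=train_dataset,  # assumed to exist
        eval_dataset=eval_dataset,    # assumed to exist
    )

    best_run = trainer.hyperparameter_search(hp_space=hparams_ray, backend="ray", n_trials=10)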


Yes, it’s done at that line specifically.

I guess I misunderstand how Ray (or Optuna) interacts with the trainer then.

I thought that the Trainer is never copied or reinitialized and that only the model is reinitialized for every trial. You seem to suggest that a new trainer is created for every trial (and that set_seed is called in the trainer init). Is that correct?

Pasted the wrong link, sorry. The Trainer itself is not recreated, but the seed is reset at each new call to train here.
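
For anyone finding this later, here is a simplified sketch of the relevant logic inside Trainer.train() (paraphrased, not the verbatim transformers source):

    def train(self, trial=None):
        # Copies the trial's sampled values (including seed, if it is part of
        # the search space) into self.args before anything else runs.
        self._hp_search_setup(trial)
        if self.model_init is not None:
            # The seed is reset BEFORE the model is re-instantiated, so every
            # trial starts from the default seed (or the trial's own sampled one).
            set_seed(self.args.seed)
            self.model = self.call_model_init(trial)
        ...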

Great, that was what I was looking for. Thanks!