Optuna with Hugging Face

Hello all, is there any example of using Optuna with Hugging Face?


Hi there :wave:

You can find a self-contained example in the “Finding Good Hyperparameters with Optuna” section of notebooks/08_model-compression.ipynb at main · nlp-with-transformers/notebooks · GitHub. In short, you can use the `hyperparameter_search` method of the `Trainer`:

def hp_space(trial):
    return {
        "num_train_epochs": trial.suggest_int("num_train_epochs", 5, 10),
        "alpha": trial.suggest_float("alpha", 0, 1),
        "temperature": trial.suggest_int("temperature", 2, 20),
    }

best_run = trainer.hyperparameter_search(
    n_trials=20, direction="maximize", hp_space=hp_space)
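To make the flow concrete without needing a GPU or a dataset, here is a minimal pure-Python sketch of what the search loop conceptually does: each trial samples a config from `hp_space`, scores it, and the best run is kept according to `direction`. The `Trial` class and `toy_objective` below are simplified stand-ins I made up for illustration, not Optuna's real internals (in practice, `Trainer` needs a `model_init` function and Optuna uses smarter sampling than uniform random):

```python
import random

class Trial:
    """Toy stand-in for an Optuna trial: samples and records parameter values."""
    def __init__(self, rng):
        self.rng = rng
        self.params = {}

    def suggest_int(self, name, low, high):
        self.params[name] = self.rng.randint(low, high)
        return self.params[name]

    def suggest_float(self, name, low, high):
        self.params[name] = self.rng.uniform(low, high)
        return self.params[name]

def hp_space(trial):
    # Same search space as in the answer above.
    return {
        "num_train_epochs": trial.suggest_int("num_train_epochs", 5, 10),
        "alpha": trial.suggest_float("alpha", 0, 1),
        "temperature": trial.suggest_int("temperature", 2, 20),
    }

def toy_objective(params):
    # Hypothetical stand-in for training + evaluating a model with `params`.
    return params["alpha"] * (1 - abs(params["temperature"] - 10) / 20)

def hyperparameter_search(n_trials, direction, hp_space, seed=0):
    rng = random.Random(seed)
    best = None  # (score, params) of the best run seen so far
    for _ in range(n_trials):
        trial = Trial(rng)
        params = hp_space(trial)
        score = toy_objective(params)
        better = best is None or (
            score > best[0] if direction == "maximize" else score < best[0]
        )
        if better:
            best = (score, params)
    return best

best_run = hyperparameter_search(n_trials=20, direction="maximize", hp_space=hp_space)
print(best_run)
```

The real `trainer.hyperparameter_search` call looks just like the last line, except that the objective is an actual training run and Optuna prunes and samples trials adaptively.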