I’m training a NER (token classification) model with RoBERTa and searching for the best hyperparameters using Trainer.hyperparameter_search() with the Optuna backend, but I can’t figure out how to include dropout in the hyperparameter search space. I tried adding classifier_dropout and hidden_dropout_prob, but neither worked.
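For concreteness, here is roughly what I tried (parameter ranges are from memory, so treat this as a sketch of the attempt, not a working config):

```python
def hp_space(trial):
    """Search space passed to Trainer.hyperparameter_search(hp_space=...)."""
    return {
        # Standard TrainingArguments fields tune fine:
        "learning_rate": trial.suggest_float("learning_rate", 1e-5, 5e-5, log=True),
        "num_train_epochs": trial.suggest_int("num_train_epochs", 2, 5),
        # These are model-config attributes, not TrainingArguments fields,
        # which is presumably why they don't take effect:
        "classifier_dropout": trial.suggest_float("classifier_dropout", 0.0, 0.3),
        "hidden_dropout_prob": trial.suggest_float("hidden_dropout_prob", 0.0, 0.3),
    }

# trainer.hyperparameter_search(hp_space=hp_space, backend="optuna", n_trials=20)
```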
Any help appreciated,
J Jones