Can trainer.hyperparameter_search also tune the drop_out_rate?

I’m trying to run a hyperparameter search for the DistilBERT model on a sequence classification task, and I also want to try different dropout rates. Can trainer.hyperparameter_search do this? I tried the following code, but it is not working:



model_init is called once at Trainer initialization with no trial, so you need to check whether trial is None inside your model_init before reading hyperparameters from it.
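A minimal sketch of what that check could look like. Assumptions: with the Ray backend the trial arrives as a plain dict of sampled hyperparameters, the search space uses a "dropout" key, and the dropout is applied via DistilBERT's dropout/attention_dropout config fields; adapt names to your setup.

```python
def dropout_from_trial(trial, default=0.1):
    # Trainer calls model_init(None) once at construction; during the
    # search, the Ray backend passes the sampled hyperparameters as a dict.
    if trial is None:
        return default
    return trial.get("dropout", default)  # "dropout" is an assumed key name

def model_init(trial=None):
    """Build a fresh DistilBERT for each trial, with the trial's dropout."""
    dropout = dropout_from_trial(trial)
    # Import kept local so the helper above stays importable on its own.
    from transformers import AutoConfig, AutoModelForSequenceClassification

    config = AutoConfig.from_pretrained(
        "distilbert-base-uncased",
        num_labels=2,
        dropout=dropout,            # DistilBERT hidden-layer dropout
        attention_dropout=dropout,  # assumption: tie both dropouts to one knob
    )
    return AutoModelForSequenceClassification.from_pretrained(
        "distilbert-base-uncased", config=config
    )
```

You would then pass `model_init=model_init` when constructing the Trainer, so every trial gets a model rebuilt with that trial's dropout rate.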

Thanks for replying. :smile:
Is there an example of using Ray Tune with the Hugging Face Trainer to search over dropout rates? I’m not sure how to handle the Ray Tune trial object properly in model_init. :face_with_head_bandage:

I can find lots of material about searching over learning rate, batch size, etc., but none of it covers searching over different dropout rates. The documentation says that model_init together with the trial object should be able to handle this…
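For reference, a hedged sketch of how the search space could include a dropout key for model_init to read, using the Ray backend. Assumptions: Ray Tune is installed, the "dropout" key name matches what your model_init looks up, and the ranges shown are illustrative only.

```python
def hp_space(trial):
    # Search space for trainer.hyperparameter_search(backend="ray").
    # Import kept local; requires `pip install "ray[tune]"`.
    from ray import tune

    return {
        "learning_rate": tune.loguniform(1e-5, 5e-5),
        "per_device_train_batch_size": tune.choice([16, 32]),
        "dropout": tune.uniform(0.05, 0.3),  # read back inside model_init
    }

# Example call (trainer built with model_init as discussed above):
# best_run = trainer.hyperparameter_search(
#     direction="maximize",
#     backend="ray",
#     hp_space=hp_space,
#     n_trials=10,
# )
```

Keys that match TrainingArguments fields (learning_rate, per_device_train_batch_size) are applied automatically; custom keys like "dropout" only take effect through your model_init.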