Preventing every dropout in the GPT2DoubleHeadsModel

Hello,

If I execute the lines below, would they prevent every dropout in the GPT2DoubleHeadsModel? Or do I need to change additional settings to disable dropout completely in my GPT2DoubleHeadsModel? Thank you,

    # disable dropouts for the model_gpt2_double_heads.
    model_gpt2_double_heads.config.resid_pdrop = 0.
    model_gpt2_double_heads.config.embd_pdrop = 0.
    model_gpt2_double_heads.config.attn_pdrop = 0.
    model_gpt2_double_heads.config.summary_first_dropout = 0.

You should do that when instantiating your model:

    GPT2DoubleHeadsModel.from_pretrained(checkpoint, resid_pdrop=0., embd_pdrop=0., ...)

for instance
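To make this concrete, here is a minimal sketch. It builds a small randomly initialized model from a `GPT2Config` so the check runs without downloading any weights; with a real checkpoint you would pass the same dropout kwargs to `from_pretrained` instead, as shown in the comment. The small config values (`n_layer=2`, etc.) are just for illustration.

```python
from transformers import GPT2Config, GPT2DoubleHeadsModel
import torch.nn as nn

# With a real checkpoint you would do:
#   GPT2DoubleHeadsModel.from_pretrained(
#       "gpt2",
#       resid_pdrop=0.0, embd_pdrop=0.0,
#       attn_pdrop=0.0, summary_first_dropout=0.0)
# Here we use a tiny random model so this runs offline.
config = GPT2Config(
    n_layer=2, n_head=2, n_embd=64, vocab_size=100,
    resid_pdrop=0.0,
    embd_pdrop=0.0,
    attn_pdrop=0.0,
    summary_first_dropout=0.0,
)
model = GPT2DoubleHeadsModel(config)

# Collect the dropout probabilities of every nn.Dropout module
# actually present in the model; with the settings above, every
# remaining dropout layer should have p == 0.
dropout_ps = {m.p for m in model.modules() if isinstance(m, nn.Dropout)}
print(dropout_ps)
```

Enumerating the `nn.Dropout` modules like this is an easy way to convince yourself that no dropout was left enabled anywhere in the model.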

Thank you for your reply.
Are there any dropouts in GPT2DoubleHeadsModel other than resid_pdrop, embd_pdrop, attn_pdrop, and summary_first_dropout?

Thanks again,

Those seem to be the only ones (we can see all parameters in the config file of GPT-2 here).

Hi @h56cho and @sgugger ,

I hope you are well. Sorry, I am fine-tuning GPT-Neo. To control overfitting, I want to increase the dropout in the model to 0.2. Can I do it as recommended above? After that, can I still use the model for fine-tuning as a pretrained model, or does it need training from scratch? Many thanks