Modifying the dropout and freezing some layers in encoder-decoder models

Hello,

I am working with the BART model and I would like to modify the dropout of some layers in the encoder and the decoder. How can I do that?

Besides, I would like to freeze the first layer of the encoder. Do you have any idea how to achieve that?

Thank you in advance.

Are you able to do this?

Not entirely sure what you want to modify in the dropout layers, but you can use the following to capture all dropout modules in your model and then go from there. I'm not sure whether replacing the modules in place is supported.

Hope this helps!!

from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", num_labels=5)

# Walk every submodule and report the dropout modules and their probabilities
for m in model.modules():
    if m.__class__.__name__.startswith("Dropout"):
        print(f"The dropout: {m}, probability: {m.p}")
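For BART specifically, note that (depending on your transformers version) the dropout probabilities inside the encoder/decoder layers may be stored as plain float attributes used with `nn.functional.dropout`, rather than as `nn.Dropout` modules, so the loop above may not find them. A minimal sketch of both ideas (changing a layer's dropout attributes and freezing the first encoder layer), using a small randomly initialized BART config to keep it self-contained; the attribute names `dropout` and `self_attn.dropout` are an assumption about the current BART implementation:

```python
from transformers import BartConfig, BartModel

# Tiny randomly initialized BART, just for illustration (no download needed)
config = BartConfig(
    d_model=64,
    encoder_layers=2,
    decoder_layers=2,
    encoder_attention_heads=2,
    decoder_attention_heads=2,
    encoder_ffn_dim=128,
    decoder_ffn_dim=128,
)
model = BartModel(config)

# Change the dropout of the first encoder layer.
# Assumption: in this transformers version, these are float attributes
# consumed by nn.functional.dropout in the layer's forward pass.
first_layer = model.encoder.layers[0]
first_layer.dropout = 0.2
first_layer.self_attn.dropout = 0.2

# Freeze the first encoder layer by disabling gradients on its parameters;
# the optimizer will then leave them untouched during training.
for param in first_layer.parameters():
    param.requires_grad = False
```

If you also want the optimizer to skip the frozen parameters entirely, you can build it with `filter(lambda p: p.requires_grad, model.parameters())`.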