Hi, I would like to freeze only the first few encoder layers in a BERT model.
I can do it by looping over the 202 parameter tensors and freezing them by position:
from transformers import BertForMaskedLM

model = BertForMaskedLM.from_pretrained('bert-base-uncased')
# parameters() returns a generator, so it has to be materialized before slicing
for param in list(model.parameters())[6:60]:
    param.requires_grad = False
But isn’t there a better way to freeze specific layers without having to rely on their position in parameters()?
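Something along these lines is what I have in mind (just a rough sketch, assuming the model exposes the submodules as model.bert.embeddings and model.bert.encoder.layer, and picking 4 layers only as an example):

from transformers import BertForMaskedLM

model = BertForMaskedLM.from_pretrained('bert-base-uncased')

# Freeze the embeddings and the first 4 encoder layers by addressing
# the submodules directly instead of slicing the flat parameter list
modules_to_freeze = [model.bert.embeddings, *model.bert.encoder.layer[:4]]
for module in modules_to_freeze:
    for param in module.parameters():
        param.requires_grad = False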
Thanks