How to freeze some layers of BertModel

You should not rely on the order returned by the parameters method, as it does not necessarily match the order of the layers in your model. Instead, call it on specific parts of your model:

# L1bb is your BertModel instance; replace 5 with the number of encoder layers you want to freeze
modules = [L1bb.embeddings, *L1bb.encoder.layer[:5]]
for module in modules:
    for param in module.parameters():
        param.requires_grad = False

This will freeze the embeddings layer and the first 5 transformer layers.
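
If you want to check that the freeze worked, here is a minimal, self-contained sketch (assuming the transformers library is installed; bert-base-uncased is used purely as an example checkpoint) that loads a BertModel, freezes the same modules, and counts trainable versus frozen parameters:

from transformers import BertModel

model = BertModel.from_pretrained("bert-base-uncased")

# Freeze the embeddings and the first 5 encoder layers
modules = [model.embeddings, *model.encoder.layer[:5]]
for module in modules:
    for param in module.parameters():
        param.requires_grad = False

# Verify: count trainable vs. frozen parameters
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
frozen = sum(p.numel() for p in model.parameters() if not p.requires_grad)
print(f"trainable: {trainable:,}  frozen: {frozen:,}")

The frozen count should correspond to the embeddings plus the first 5 layers; everything else stays trainable.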
