Huggingface sequence classification unfreezing layers

I am using Longformer for sequence classification (a binary problem).

I have downloaded the required files:

from transformers import LongformerForSequenceClassification, LongformerTokenizerFast

# load model and tokenizer and define the length of the text sequence
model = LongformerForSequenceClassification.from_pretrained('allenai/longformer-base-4096',
                                                            gradient_checkpointing=False,
                                                            attention_window=512)
tokenizer = LongformerTokenizerFast.from_pretrained('allenai/longformer-base-4096', max_length=1024)

Then, as shown here, I ran the code below:

for name, param in model.named_parameters():
    print(name, param.requires_grad)


longformer.embeddings.word_embeddings.weight True
longformer.embeddings.position_embeddings.weight True
longformer.embeddings.token_type_embeddings.weight True
longformer.embeddings.LayerNorm.weight True
longformer.embeddings.LayerNorm.bias True
longformer.encoder.layer.0.attention.self.query.weight True
longformer.encoder.layer.0.attention.self.query.bias True
longformer.encoder.layer.0.attention.self.key.weight True
longformer.encoder.layer.0.attention.self.key.bias True
longformer.encoder.layer.0.attention.self.value.weight True
longformer.encoder.layer.0.attention.self.value.bias True
longformer.encoder.layer.0.attention.self.query_global.weight True
longformer.encoder.layer.0.attention.self.query_global.bias True
longformer.encoder.layer.0.attention.self.key_global.weight True
longformer.encoder.layer.0.attention.self.key_global.bias True
longformer.encoder.layer.0.attention.self.value_global.weight True
longformer.encoder.layer.0.attention.self.value_global.bias True
longformer.encoder.layer.0.attention.output.dense.weight True
longformer.encoder.layer.0.attention.output.dense.bias True
longformer.encoder.layer.0.attention.output.LayerNorm.weight True
longformer.encoder.layer.0.attention.output.LayerNorm.bias True
longformer.encoder.layer.0.intermediate.dense.weight True
longformer.encoder.layer.0.intermediate.dense.bias True
longformer.encoder.layer.0.output.dense.weight True
longformer.encoder.layer.0.output.dense.bias True
longformer.encoder.layer.0.output.LayerNorm.weight True
longformer.encoder.layer.0.output.LayerNorm.bias True
longformer.encoder.layer.1.attention.self.query.weight True
longformer.encoder.layer.1.attention.self.query.bias True
longformer.encoder.layer.1.attention.self.key.weight True
longformer.encoder.layer.1.attention.self.key.bias True
longformer.encoder.layer.1.attention.self.value.weight True
longformer.encoder.layer.1.attention.self.value.bias True
longformer.encoder.layer.1.attention.self.query_global.weight True
longformer.encoder.layer.1.attention.self.query_global.bias True
longformer.encoder.layer.1.attention.self.key_global.weight True
longformer.encoder.layer.1.attention.self.key_global.bias True
longformer.encoder.layer.1.attention.self.value_global.weight True
longformer.encoder.layer.1.attention.self.value_global.bias True
longformer.encoder.layer.1.attention.output.dense.weight True
longformer.encoder.layer.1.attention.output.dense.bias True
longformer.encoder.layer.1.attention.output.LayerNorm.weight True
longformer.encoder.layer.1.attention.output.LayerNorm.bias True
longformer.encoder.layer.1.intermediate.dense.weight True
longformer.encoder.layer.1.intermediate.dense.bias True
longformer.encoder.layer.1.output.dense.weight True
longformer.encoder.layer.1.output.dense.bias True
longformer.encoder.layer.1.output.LayerNorm.weight True
longformer.encoder.layer.1.output.LayerNorm.bias True
longformer.encoder.layer.2.attention.self.query.weight True
longformer.encoder.layer.2.attention.self.query.bias True
longformer.encoder.layer.2.attention.self.key.weight True
longformer.encoder.layer.2.attention.self.key.bias True
longformer.encoder.layer.2.attention.self.value.weight True
longformer.encoder.layer.2.attention.self.value.bias True
longformer.encoder.layer.2.attention.self.query_global.weight True
longformer.encoder.layer.2.attention.self.query_global.bias True
longformer.encoder.layer.2.attention.self.key_global.weight True
longformer.encoder.layer.2.attention.self.key_global.bias True
longformer.encoder.layer.2.attention.self.value_global.weight True
longformer.encoder.layer.2.attention.self.value_global.bias True
longformer.encoder.layer.2.attention.output.dense.weight True
longformer.encoder.layer.2.attention.output.dense.bias True
longformer.encoder.layer.2.attention.output.LayerNorm.weight True
longformer.encoder.layer.2.attention.output.LayerNorm.bias True
longformer.encoder.layer.2.intermediate.dense.weight True
longformer.encoder.layer.2.intermediate.dense.bias True
longformer.encoder.layer.2.output.dense.weight True
longformer.encoder.layer.2.output.dense.bias True
longformer.encoder.layer.2.output.LayerNorm.weight True
longformer.encoder.layer.2.output.LayerNorm.bias True
longformer.encoder.layer.3.attention.self.query.weight True
longformer.encoder.layer.3.attention.self.query.bias True
longformer.encoder.layer.3.attention.self.key.weight True
longformer.encoder.layer.3.attention.self.key.bias True
longformer.encoder.layer.3.attention.self.value.weight True
longformer.encoder.layer.3.attention.self.value.bias True
longformer.encoder.layer.3.attention.self.query_global.weight True
longformer.encoder.layer.3.attention.self.query_global.bias True
longformer.encoder.layer.3.attention.self.key_global.weight True
longformer.encoder.layer.3.attention.self.key_global.bias True
longformer.encoder.layer.3.attention.self.value_global.weight True
longformer.encoder.layer.3.attention.self.value_global.bias True
longformer.encoder.layer.3.attention.output.dense.weight True
longformer.encoder.layer.3.attention.output.dense.bias True
longformer.encoder.layer.3.attention.output.LayerNorm.weight True
longformer.encoder.layer.3.attention.output.LayerNorm.bias True
longformer.encoder.layer.3.intermediate.dense.weight True
longformer.encoder.layer.3.intermediate.dense.bias True
longformer.encoder.layer.3.output.dense.weight True
longformer.encoder.layer.3.output.dense.bias True
longformer.encoder.layer.3.output.LayerNorm.weight True
longformer.encoder.layer.3.output.LayerNorm.bias True
longformer.encoder.layer.4.attention.self.query.weight True
longformer.encoder.layer.4.attention.self.query.bias True
longformer.encoder.layer.4.attention.self.key.weight True
longformer.encoder.layer.4.attention.self.key.bias True
longformer.encoder.layer.4.attention.self.value.weight True
longformer.encoder.layer.4.attention.self.value.bias True
longformer.encoder.layer.4.attention.self.query_global.weight True
longformer.encoder.layer.4.attention.self.query_global.bias True
longformer.encoder.layer.4.attention.self.key_global.weight True
longformer.encoder.layer.4.attention.self.key_global.bias True
longformer.encoder.layer.4.attention.self.value_global.weight True
longformer.encoder.layer.4.attention.self.value_global.bias True
longformer.encoder.layer.4.attention.output.dense.weight True
longformer.encoder.layer.4.attention.output.dense.bias True
longformer.encoder.layer.4.attention.output.LayerNorm.weight True
longformer.encoder.layer.4.attention.output.LayerNorm.bias True
longformer.encoder.layer.4.intermediate.dense.weight True
longformer.encoder.layer.4.intermediate.dense.bias True
longformer.encoder.layer.4.output.dense.weight True
longformer.encoder.layer.4.output.dense.bias True
longformer.encoder.layer.4.output.LayerNorm.weight True
longformer.encoder.layer.4.output.LayerNorm.bias True
longformer.encoder.layer.5.attention.self.query.weight True
longformer.encoder.layer.5.attention.self.query.bias True
longformer.encoder.layer.5.attention.self.key.weight True
longformer.encoder.layer.5.attention.self.key.bias True
longformer.encoder.layer.5.attention.self.value.weight True
longformer.encoder.layer.5.attention.self.value.bias True
longformer.encoder.layer.5.attention.self.query_global.weight True
longformer.encoder.layer.5.attention.self.query_global.bias True
longformer.encoder.layer.5.attention.self.key_global.weight True
longformer.encoder.layer.5.attention.self.key_global.bias True
longformer.encoder.layer.5.attention.self.value_global.weight True
longformer.encoder.layer.5.attention.self.value_global.bias True
longformer.encoder.layer.5.attention.output.dense.weight True
longformer.encoder.layer.5.attention.output.dense.bias True
longformer.encoder.layer.5.attention.output.LayerNorm.weight True
longformer.encoder.layer.5.attention.output.LayerNorm.bias True
longformer.encoder.layer.5.intermediate.dense.weight True
longformer.encoder.layer.5.intermediate.dense.bias True
longformer.encoder.layer.5.output.dense.weight True
longformer.encoder.layer.5.output.dense.bias True
longformer.encoder.layer.5.output.LayerNorm.weight True
longformer.encoder.layer.5.output.LayerNorm.bias True
longformer.encoder.layer.6.attention.self.query.weight True
longformer.encoder.layer.6.attention.self.query.bias True
longformer.encoder.layer.6.attention.self.key.weight True
longformer.encoder.layer.6.attention.self.key.bias True
longformer.encoder.layer.6.attention.self.value.weight True
longformer.encoder.layer.6.attention.self.value.bias True
longformer.encoder.layer.6.attention.self.query_global.weight True
longformer.encoder.layer.6.attention.self.query_global.bias True
longformer.encoder.layer.6.attention.self.key_global.weight True
longformer.encoder.layer.6.attention.self.key_global.bias True
longformer.encoder.layer.6.attention.self.value_global.weight True
longformer.encoder.layer.6.attention.self.value_global.bias True
longformer.encoder.layer.6.attention.output.dense.weight True
longformer.encoder.layer.6.attention.output.dense.bias True
longformer.encoder.layer.6.attention.output.LayerNorm.weight True
longformer.encoder.layer.6.attention.output.LayerNorm.bias True
longformer.encoder.layer.6.intermediate.dense.weight True
longformer.encoder.layer.6.intermediate.dense.bias True
longformer.encoder.layer.6.output.dense.weight True
longformer.encoder.layer.6.output.dense.bias True
longformer.encoder.layer.6.output.LayerNorm.weight True
longformer.encoder.layer.6.output.LayerNorm.bias True
longformer.encoder.layer.7.attention.self.query.weight True
longformer.encoder.layer.7.attention.self.query.bias True
longformer.encoder.layer.7.attention.self.key.weight True
longformer.encoder.layer.7.attention.self.key.bias True
longformer.encoder.layer.7.attention.self.value.weight True
longformer.encoder.layer.7.attention.self.value.bias True
longformer.encoder.layer.7.attention.self.query_global.weight True
longformer.encoder.layer.7.attention.self.query_global.bias True
longformer.encoder.layer.7.attention.self.key_global.weight True
longformer.encoder.layer.7.attention.self.key_global.bias True
longformer.encoder.layer.7.attention.self.value_global.weight True
longformer.encoder.layer.7.attention.self.value_global.bias True
longformer.encoder.layer.7.attention.output.dense.weight True
longformer.encoder.layer.7.attention.output.dense.bias True
longformer.encoder.layer.7.attention.output.LayerNorm.weight True
longformer.encoder.layer.7.attention.output.LayerNorm.bias True
longformer.encoder.layer.7.intermediate.dense.weight True
longformer.encoder.layer.7.intermediate.dense.bias True
longformer.encoder.layer.7.output.dense.weight True
longformer.encoder.layer.7.output.dense.bias True
longformer.encoder.layer.7.output.LayerNorm.weight True
longformer.encoder.layer.7.output.LayerNorm.bias True
longformer.encoder.layer.8.attention.self.query.weight True
longformer.encoder.layer.8.attention.self.query.bias True
longformer.encoder.layer.8.attention.self.key.weight True
longformer.encoder.layer.8.attention.self.key.bias True
longformer.encoder.layer.8.attention.self.value.weight True
longformer.encoder.layer.8.attention.self.value.bias True
longformer.encoder.layer.8.attention.self.query_global.weight True
longformer.encoder.layer.8.attention.self.query_global.bias True
longformer.encoder.layer.8.attention.self.key_global.weight True
longformer.encoder.layer.8.attention.self.key_global.bias True
longformer.encoder.layer.8.attention.self.value_global.weight True
longformer.encoder.layer.8.attention.self.value_global.bias True
longformer.encoder.layer.8.attention.output.dense.weight True
longformer.encoder.layer.8.attention.output.dense.bias True
longformer.encoder.layer.8.attention.output.LayerNorm.weight True
longformer.encoder.layer.8.attention.output.LayerNorm.bias True
longformer.encoder.layer.8.intermediate.dense.weight True
longformer.encoder.layer.8.intermediate.dense.bias True
longformer.encoder.layer.8.output.dense.weight True
longformer.encoder.layer.8.output.dense.bias True
longformer.encoder.layer.8.output.LayerNorm.weight True
longformer.encoder.layer.8.output.LayerNorm.bias True
longformer.encoder.layer.9.attention.self.query.weight True
longformer.encoder.layer.9.attention.self.query.bias True
longformer.encoder.layer.9.attention.self.key.weight True
longformer.encoder.layer.9.attention.self.key.bias True
longformer.encoder.layer.9.attention.self.value.weight True
longformer.encoder.layer.9.attention.self.value.bias True
longformer.encoder.layer.9.attention.self.query_global.weight True
longformer.encoder.layer.9.attention.self.query_global.bias True
longformer.encoder.layer.9.attention.self.key_global.weight True
longformer.encoder.layer.9.attention.self.key_global.bias True
longformer.encoder.layer.9.attention.self.value_global.weight True
longformer.encoder.layer.9.attention.self.value_global.bias True
longformer.encoder.layer.9.attention.output.dense.weight True
longformer.encoder.layer.9.attention.output.dense.bias True
longformer.encoder.layer.9.attention.output.LayerNorm.weight True
longformer.encoder.layer.9.attention.output.LayerNorm.bias True
longformer.encoder.layer.9.intermediate.dense.weight True
longformer.encoder.layer.9.intermediate.dense.bias True
longformer.encoder.layer.9.output.dense.weight True
longformer.encoder.layer.9.output.dense.bias True
longformer.encoder.layer.9.output.LayerNorm.weight True
longformer.encoder.layer.9.output.LayerNorm.bias True
longformer.encoder.layer.10.attention.self.query.weight True
longformer.encoder.layer.10.attention.self.query.bias True
longformer.encoder.layer.10.attention.self.key.weight True
longformer.encoder.layer.10.attention.self.key.bias True
longformer.encoder.layer.10.attention.self.value.weight True
longformer.encoder.layer.10.attention.self.value.bias True
longformer.encoder.layer.10.attention.self.query_global.weight True
longformer.encoder.layer.10.attention.self.query_global.bias True
longformer.encoder.layer.10.attention.self.key_global.weight True
longformer.encoder.layer.10.attention.self.key_global.bias True
longformer.encoder.layer.10.attention.self.value_global.weight True
longformer.encoder.layer.10.attention.self.value_global.bias True
longformer.encoder.layer.10.attention.output.dense.weight True
longformer.encoder.layer.10.attention.output.dense.bias True
longformer.encoder.layer.10.attention.output.LayerNorm.weight True
longformer.encoder.layer.10.attention.output.LayerNorm.bias True
longformer.encoder.layer.10.intermediate.dense.weight True
longformer.encoder.layer.10.intermediate.dense.bias True
longformer.encoder.layer.10.output.dense.weight True
longformer.encoder.layer.10.output.dense.bias True
longformer.encoder.layer.10.output.LayerNorm.weight True
longformer.encoder.layer.10.output.LayerNorm.bias True
longformer.encoder.layer.11.attention.self.query.weight True
longformer.encoder.layer.11.attention.self.query.bias True
longformer.encoder.layer.11.attention.self.key.weight True
longformer.encoder.layer.11.attention.self.key.bias True
longformer.encoder.layer.11.attention.self.value.weight True
longformer.encoder.layer.11.attention.self.value.bias True
longformer.encoder.layer.11.attention.self.query_global.weight True
longformer.encoder.layer.11.attention.self.query_global.bias True
longformer.encoder.layer.11.attention.self.key_global.weight True
longformer.encoder.layer.11.attention.self.key_global.bias True
longformer.encoder.layer.11.attention.self.value_global.weight True
longformer.encoder.layer.11.attention.self.value_global.bias True
longformer.encoder.layer.11.attention.output.dense.weight True
longformer.encoder.layer.11.attention.output.dense.bias True
longformer.encoder.layer.11.attention.output.LayerNorm.weight True
longformer.encoder.layer.11.attention.output.LayerNorm.bias True
longformer.encoder.layer.11.intermediate.dense.weight True
longformer.encoder.layer.11.intermediate.dense.bias True
longformer.encoder.layer.11.output.dense.weight True
longformer.encoder.layer.11.output.dense.bias True
longformer.encoder.layer.11.output.LayerNorm.weight True
longformer.encoder.layer.11.output.LayerNorm.bias True
classifier.dense.weight True
classifier.dense.bias True
classifier.out_proj.weight True
classifier.out_proj.bias True

My questions

  1. Why is param.requires_grad True for all layers? Shouldn't it be False at least for the classifier layers? Aren't we training them?
  2. Does param.requires_grad == True mean that a particular layer is frozen? I am confused by the wording requires_grad. Does it mean frozen?
  3. If I want to train only some of the layers, as shown here, should I use the code below?

for name, param in model.named_parameters():
    if name.startswith("..."):  # choose whatever you like here
        param.requires_grad = False
  4. Considering it takes a lot of time to train, is there a specific recommendation regarding which layers I should train? To begin with, I am planning to train:

all layers starting with longformer.encoder.layer.11., and

`classifier.dense.weight` 
`classifier.dense.bias` 
`classifier.out_proj.weight` 
`classifier.out_proj.bias`
  5. Do I need to add any additional layers such as dropout, or is that already taken care of by LongformerForSequenceClassification.from_pretrained? I am not seeing any dropout layers in the above output, which is why I am asking.

Why is param.requires_grad True for all layers? Shouldn't it be False at least for the classifier layers?

That's because, by default, we train all parameters of a model. Hence, we compute gradients for all parameters and update them using gradient descent.
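You can check this yourself by counting how many parameters will receive gradients (a quick sketch using only standard PyTorch attributes):

# every parameter is trainable by default, so all of them require gradients
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"{trainable:,} trainable out of {total:,} parameters")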

Does param.requires_grad == True mean that a particular layer is frozen? I am confused by the wording requires_grad. Does it mean frozen?

No, requires_grad=True means that a parameter will get updated if you start training. To freeze a parameter, you don't want gradients for it, so you would set its requires_grad to False.
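As a minimal sketch (assuming the parameter names printed above), freezing the entire base model so that only the classification head is trained would look like this:

# freeze every parameter of the base Longformer; only the classifier head keeps gradients
for name, param in model.named_parameters():
    if name.startswith("longformer."):
        param.requires_grad = False

# sanity check: only classifier.* parameters should still require gradients
print([name for name, param in model.named_parameters() if param.requires_grad])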

If I want to train only some of the layers, as shown here, should I use the code below?

When fine-tuning language models such as BERT, RoBERTa, Longformer, etc., we typically update all layers. However, recent research (e.g. BitFit) has shown that this is actually not necessary; you can get similar results just by fine-tuning the biases (!) of the layers.
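A rough sketch of that bias-only (BitFit-style) fine-tuning, keyed to the parameter names printed above, could look like this:

# bias-only fine-tuning: keep biases and the classification head trainable, freeze everything else
for name, param in model.named_parameters():
    param.requires_grad = name.endswith(".bias") or name.startswith("classifier.")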

Considering it takes a lot of time to train, is there a specific recommendation regarding which layers I should train?

The default is simply to train all layers, so you don't need to set requires_grad to False anywhere.
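If you do want to try your plan of training only the last encoder layer plus the classification head, a sketch (again based on the parameter names shown above) would be:

# freeze everything except encoder layer 11 and the classification head
for name, param in model.named_parameters():
    param.requires_grad = (name.startswith("longformer.encoder.layer.11.")
                           or name.startswith("classifier."))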

Do I need to add any additional layers such as dropout, or is that already taken care of by LongformerForSequenceClassification.from_pretrained? I am not seeing any dropout layers in the above output, which is why I am asking.

This model already includes dropout. It's not shown when printing the parameters because dropout doesn't have any trainable parameters (no weights or biases). You can see that it's already included here. As can be seen, the classifier that is placed on top of the base Longformer model is a LongformerClassificationHead, which includes a dropout layer.
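You can confirm this by printing the classification head module itself; the dropout shows up there even though it contributes no trainable parameters (exact output may vary by transformers version):

print(model.classifier)
# LongformerClassificationHead(
#   (dense): Linear(in_features=768, out_features=768, bias=True)
#   (dropout): Dropout(p=0.1, inplace=False)
#   (out_proj): Linear(in_features=768, out_features=2, bias=True)
# )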


Just want to say this, @nielsr: I appreciate you going through my question and answering each piece of it. You gave a detailed answer and it resolved my queries. Most of the time I have to ask follow-up questions, but your approach of quoting each part of my question and then replying seems perfect to me. :slight_smile: