How to freeze DistilBERT params during fine-tuning?

I am working with TFDistilBertForSequenceClassification and trying to fine-tune it. How can I freeze the DistilBERT layers and restrict training to the parameters of the added classification head layers only? Below is the model summary:


Layer (type)                         Output Shape    Param #
=============================================================
distilbert (TFDistilBertMainLayer)   multiple        66362880

pre_classifier (Dense)               multiple        590592

classifier (Dense)                   multiple        1538

dropout_159 (Dropout)                multiple        0
=============================================================
Total params: 66,955,010
Trainable params: 66,955,010
Non-trainable params: 0
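
For context, this is roughly how the model is set up and compiled before training (the checkpoint name, number of labels, and optimizer settings below are assumptions for illustration; the classifier layer's 1,538 params suggest 2 labels, i.e. 768*2 + 2):

from transformers import TFDistilBertForSequenceClassification
import tensorflow as tf

# Assumed checkpoint and label count for illustration
model = TFDistilBertForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

# Assumed optimizer/loss; the model outputs logits, hence from_logits=True
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=5e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)

model.summary()  # prints the layer listing shown above

All 66,955,010 params show up as trainable with this setup, and I want only pre_classifier and classifier to be updated during training.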