Yes, freezing layers in PyTorch is straightforward: set `requires_grad = False` on the parameters you want to exclude from training. For example:
```python
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", num_labels=1)

for name, param in model.named_parameters():
    if name.startswith("..."):  # choose whatever prefix you like here
        param.requires_grad = False
```
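Here is a minimal sketch of the same name-prefix pattern on a tiny `torch` model, so the effect is easy to inspect without downloading BERT. The module names `encoder` and `classifier` are just illustrative choices:

```python
import torch.nn as nn

# Illustrative two-module model; the names are arbitrary.
model = nn.Sequential()
model.add_module("encoder", nn.Linear(8, 8))
model.add_module("classifier", nn.Linear(8, 2))

# Freeze every parameter whose name starts with "encoder".
for name, param in model.named_parameters():
    if name.startswith("encoder"):
        param.requires_grad = False

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # → ['classifier.weight', 'classifier.bias']
```

Note that the frozen parameters are still part of the model and still used in the forward pass; they simply receive no gradient updates. If you like, you can also pass only the trainable parameters to the optimizer, e.g. `torch.optim.AdamW(p for p in model.parameters() if p.requires_grad)`.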