How to get an intermediate layer from a model with a custom head on top of BERT

I built a custom model with a linear regression head on top of a BERT model:

import torch.nn as nn
from transformers import BertModel, PreTrainedModel


class BERTRegressor(PreTrainedModel):

    def __init__(self, config, drop_rate=0.2, freeze_bert=False):
        super().__init__(config)
        D_in, D_out = 768, 20  # BERT hidden size -> 20-dimensional output
        self.backbone_bert = BertModel.from_pretrained('bert-base-uncased')
        self.regressor = nn.Sequential(
            nn.Dropout(drop_rate),
            nn.Linear(D_in, D_out))

    def forward(self, input_ids, attention_mask):
        outputs = self.backbone_bert(input_ids, attention_mask=attention_mask)
        pooled_output = outputs[1]  # pooled [CLS] representation
        return self.regressor(pooled_output)
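
For reference, here is a minimal sketch of how I instantiate and call the model (assuming the standard bert-base-uncased config and tokenizer; the example sentence is just a placeholder):

from transformers import BertConfig, BertTokenizer

config = BertConfig.from_pretrained('bert-base-uncased')
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BERTRegressor(config, drop_rate=0.2)

# Tokenize a batch and run a forward pass; output has shape (batch_size, 20)
batch = tokenizer(["An example sentence."], return_tensors='pt',
                  padding=True, truncation=True)
predictions = model(batch['input_ids'], batch['attention_mask'])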

The model should output a 20-dimensional vector, so D_in = 768 and D_out = 20.
I trained this model on my dataset, so the weights of the backbone BERT have been updated.
Now I want to extract sentence embeddings from the backbone after training.

How can I do this?
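
This is roughly what I had in mind, but I am not sure it is the right approach (model here stands for my trained BERTRegressor instance, and the sentence is just a placeholder):

import torch
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model.eval()  # model is the trained BERTRegressor instance

encoded = tokenizer("An example sentence.", return_tensors='pt')

with torch.no_grad():
    # Call only the fine-tuned backbone and skip the regression head
    outputs = model.backbone_bert(input_ids=encoded['input_ids'],
                                  attention_mask=encoded['attention_mask'])

sentence_embedding = outputs.pooler_output    # shape (1, 768), pooled [CLS] vector
token_embeddings = outputs.last_hidden_state  # shape (1, seq_len, 768), per-token vectors

Is this correct, or is there a better way to pull the embedding out of the fine-tuned backbone?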

Thanks!