How to set the number of trainable layers in BERT with TensorFlow 2.0?

I want to build an NLP classification model that uses BERT for tokenization and embedding, with a classification head added on top. I want to freeze the BERT layers first and train only the classifier. Then I want to make only the last 2 BERT layers trainable and train them together with the classifier. Something like the sketch below is roughly what I'm aiming for.
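
This is a minimal sketch of my setup, assuming HuggingFace's `transformers` `TFBertModel` (the model name, the sequence length of 128, and the 2-class head are just placeholders); the loop over `bert.bert.encoder.layer` at the end is my guess at how to reach the individual encoder blocks:

```python
import tensorflow as tf
from transformers import TFBertModel  # assuming HuggingFace's transformers

# Load pretrained BERT as a Keras layer.
bert = TFBertModel.from_pretrained("bert-base-uncased")

# Inputs: token ids and attention mask (sequence length 128 is arbitrary).
input_ids = tf.keras.layers.Input(shape=(128,), dtype=tf.int32, name="input_ids")
attention_mask = tf.keras.layers.Input(shape=(128,), dtype=tf.int32, name="attention_mask")

# Classification head on top of BERT's pooled [CLS] representation.
pooled = bert(input_ids, attention_mask=attention_mask)[1]
logits = tf.keras.layers.Dense(2, activation="softmax", name="classifier")(pooled)
model = tf.keras.Model(inputs=[input_ids, attention_mask], outputs=logits)

# Stage 1: freeze all of BERT and train only the head.
bert.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(2e-5),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_data, ...)

# Stage 2: unfreeze only the last 2 transformer blocks, keep the rest frozen.
bert.trainable = True
for layer in bert.bert.encoder.layer[:-2]:  # is this the right way to do it?
    layer.trainable = False
# Recompile so the changed trainable flags take effect.
model.compile(optimizer=tf.keras.optimizers.Adam(2e-5),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_data, ...)
```

In particular, is indexing `bert.bert.encoder.layer` the right way to freeze individual blocks, or is there a cleaner API for this in TF 2.0?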

How do I do that? Thanks.