Related query

When we fine-tune bert-base on custom data, does the script fine-tune all the layers (head and body/core), or only the last layer while freezing the others?
Can we also add custom layers and fine-tune with them? Please provide examples.
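By default the example script's Trainer updates every parameter, head and body alike. A minimal sketch of the alternative, freezing the BERT body so only the QA head trains, is below; it uses a tiny random-weight `BertConfig` so it runs instantly, but in practice you would load `bert-base-uncased` with `from_pretrained` instead.

```python
from transformers import BertConfig, BertForQuestionAnswering

# Tiny config so the sketch runs quickly; swap in
# BertForQuestionAnswering.from_pretrained("bert-base-uncased") for real use.
config = BertConfig(hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64)
model = BertForQuestionAnswering(config)

# Freeze the BERT body; only the qa_outputs span-classifier head
# keeps requires_grad=True and will be updated by the optimizer/Trainer.
for param in model.bert.parameters():
    param.requires_grad = False

trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)  # ['qa_outputs.weight', 'qa_outputs.bias']
```

The same pattern works with the Trainer: freeze the parameters before passing the model in, and the optimizer will simply skip them.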

script: main/examples/pytorch/question-answering/

Reviving this thread, please provide answers.

Example command used:
--model_name_or_path bert-base-uncased
--dataset_name squad
--per_device_train_batch_size 12
--learning_rate 3e-5
--num_train_epochs 2
--max_seq_length 384
--doc_stride 128
--output_dir /tmp/debug_squad/
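On adding custom layers: one common approach is to wrap the BERT body in your own `nn.Module` and insert extra layers before the span classifier. The sketch below is hypothetical (the class name and the extra `Linear`/`Tanh`/`Dropout` stack are illustrative choices, not part of the library); it again uses a tiny config so it runs without downloading weights.

```python
import torch
import torch.nn as nn
from transformers import BertConfig, BertModel

class BertQAWithExtraHead(nn.Module):
    """Hypothetical sketch: BERT body plus a custom intermediate layer
    inserted before the start/end span classifier."""

    def __init__(self, config):
        super().__init__()
        self.bert = BertModel(config)
        # Custom layers added on top of the encoder output.
        self.custom = nn.Sequential(
            nn.Linear(config.hidden_size, config.hidden_size),
            nn.Tanh(),
            nn.Dropout(0.1),
        )
        # Two logits per token: span start and span end.
        self.qa_outputs = nn.Linear(config.hidden_size, 2)

    def forward(self, input_ids, attention_mask=None):
        hidden = self.bert(input_ids,
                           attention_mask=attention_mask).last_hidden_state
        hidden = self.custom(hidden)
        logits = self.qa_outputs(hidden)
        start_logits, end_logits = logits.split(1, dim=-1)
        return start_logits.squeeze(-1), end_logits.squeeze(-1)

# Tiny config for a quick smoke test; use
# BertModel.from_pretrained("bert-base-uncased") for real fine-tuning.
config = BertConfig(hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64)
model = BertQAWithExtraHead(config)
ids = torch.randint(0, config.vocab_size, (2, 16))
start, end = model(ids)
print(start.shape, end.shape)  # torch.Size([2, 16]) torch.Size([2, 16])
```

A model like this can be trained with the Trainer as long as its forward returns (or is adapted to return) a loss, or with a plain PyTorch training loop.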