Is there a way to access layers in TFBertMainLayer?

Here is `model.layers` for the pre-trained BERT model:

```
[<transformers.models.bert.modeling_tf_bert.TFBertMainLayer at 0x7f1d5628ac90>,
 <tensorflow.python.keras.layers.core.Dropout at 0x7f1d51c04cd0>,
 <tensorflow.python.keras.layers.core.Dense at 0x7f1d51c04f50>]
```

In particular, I would like to try a regularization method on the token embeddings, i.e. I would like to access `TFBertEmbeddings` and the embedding layer defined as

```python
self.token_type_embeddings = tf.keras.layers.Embedding(
    config.type_vocab_size,
    config.hidden_size,
    embeddings_initializer=get_initializer(self.initializer_range),
    name="token_type_embeddings",
)
```

with the purpose of setting `embeddings_regularizer=` to a specific value.
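To make the goal concrete, here is a minimal sketch of the kind of access I have in mind, using a plain `tf.keras` model in place of the BERT model (`MainLayer` and its attribute names are stand-ins, not the actual transformers API). Since a regularizer assigned after a layer is built is ignored, the sketch instead attaches an explicit L2 penalty on the embedding weights via `model.add_loss`:

```python
import tensorflow as tf

class MainLayer(tf.keras.layers.Layer):
    """Stand-in for TFBertMainLayer: owns an Embedding sub-layer."""

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.token_type_embeddings = tf.keras.layers.Embedding(
            2, 8, name="token_type_embeddings"
        )

    def call(self, inputs):
        # Average embeddings over the sequence dimension.
        return tf.reduce_mean(self.token_type_embeddings(inputs), axis=1)

inputs = tf.keras.Input(shape=(4,), dtype=tf.int32)
main = MainLayer(name="main")
outputs = tf.keras.layers.Dense(1)(main(inputs))
model = tf.keras.Model(inputs, outputs)

# The nested layer shows up in model.layers, and its sub-layers are
# ordinary attributes, so they can be reached directly:
emb = model.layers[1].token_type_embeddings

# Setting embeddings_regularizer here would have no effect (the layer is
# already built), so add the penalty as an explicit loss term instead:
model.add_loss(lambda: 1e-4 * tf.reduce_sum(tf.square(emb.embeddings)))

model.compile(optimizer="adam", loss="mse")
```

If the same attribute-style access works for the real model (e.g. reaching the embeddings through `model.layers[0]`, the `TFBertMainLayer`), the same `add_loss` trick should apply there too.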