How do I access a particular layer of Hugging Face's pre-trained BERT model?

For experimentation purposes, I need to access the embedding layer of the encoder; in the TensorFlow implementation, that is the layer defined as tf.keras.layers.Embedding(…).

For example, how can I set the embeddings_regularizer= argument of the Embedding() layer in the encoder part of the transformer?
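For a plain Keras model this is straightforward. Here is a minimal sketch of the behaviour I want to reproduce (the vocabulary size, hidden size, and regularizer strength are just placeholders):

```python
import tensorflow as tf

# In plain Keras I would pass the regularizer at construction time; the
# question is how to do the equivalent for the embedding layer that is
# buried inside the pre-trained BERT encoder.
embedding = tf.keras.layers.Embedding(
    input_dim=30522,   # BERT's vocabulary size, for illustration
    output_dim=768,    # hidden size of bert-base, for illustration
    embeddings_regularizer=tf.keras.regularizers.l2(1e-5),
)
```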

Here is the output of model.layers:
```
[<transformers.models.bert.modeling_tf_bert.TFBertMainLayer at 0x7f1d5628ac90>,
 <tensorflow.python.keras.layers.core.Dropout at 0x7f1d51c04cd0>,
 <tensorflow.python.keras.layers.core.Dense at 0x7f1d51c04f50>]
```
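For reference, the model was loaded along these lines (TF 2.x with a recent transformers release):

```python
from transformers import TFBertForSequenceClassification

model = TFBertForSequenceClassification.from_pretrained("bert-base-uncased")
print(model.layers)  # prints the list above; model.layers[0] is the TFBertMainLayer
```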

In other words, is there a way to access the individual layers inside TFBertMainLayer?
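From poking at the object with dir(), the sub-modules seem to be exposed as plain attributes, so I imagine something along the following lines. I am not sure whether this is the supported way, or whether the attribute names are stable across transformers versions, so treat the names below as my guesses:

```python
import tensorflow as tf

main_layer = model.layers[0]  # the TFBertMainLayer, also exposed as model.bert

# Attribute names as found via dir(); they may differ between versions.
embeddings = main_layer.embeddings         # TFBertEmbeddings (a custom layer,
                                           # not a tf.keras.layers.Embedding)
first_block = main_layer.encoder.layer[0]  # first transformer block

# Since embeddings_regularizer cannot be passed after construction, one
# workaround might be to add the penalty manually. Keras's add_loss accepts
# a zero-argument callable, so the loss is recomputed every training step.
l2 = tf.keras.regularizers.l2(1e-5)        # placeholder strength
word_embedding_matrix = embeddings.weight  # attribute name varies by version
model.add_loss(lambda: l2(word_embedding_matrix))
```

Would adding the regularization loss manually like this be the right approach, or is there a cleaner, officially supported way?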