TensorFlow model.summary() doesn't show detail of TFBertModel

I loaded a transformers BERT model into TensorFlow using:

from transformers import TFBertModel, BertTokenizer

model_version = 'bert-base-uncased'
do_lower_case = True
model = TFBertModel.from_pretrained(model_version, output_attentions=True)
tokenizer = BertTokenizer.from_pretrained(model_version, do_lower_case=do_lower_case)

which seemed to work, but the TensorFlow model.summary() command doesn't show as much detail as I would expect.
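For context, this is roughly how I build the model before calling summary(); the input sentence is just a placeholder to trigger one forward pass:

# run one forward pass so the model gets built and summary() has parameter counts
inputs = tokenizer.encode_plus("hello world", return_tensors="tf")
outputs = model(inputs["input_ids"])
model.summary()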

Model: "tf_bert_model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
bert (TFBertMainLayer)       multiple                  109482240
=================================================================
Total params: 109,482,240
Trainable params: 109,482,240
Non-trainable params: 0
_________________________________________________________________

Has the model been loaded correctly?
Do I need to define the config, even though it is loading a predefined model?
Is there any way to get TensorFlow to show what is inside the TFBertMainLayer?
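For what it's worth, I can list the sub-layers by drilling into the objects directly; the attribute access below is just my rough sketch of how the nesting looks, not something taken from the docs:

# the single row in summary() is the TFBertMainLayer; list what it contains
bert_layer = model.layers[0]
for sub in bert_layer.submodules:
    print(sub.name, type(sub).__name__)

But ideally model.summary() would show this breakdown on its own.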

(When I loaded a similar bert-base model using keras-bert, model.summary() showed a lot more detail.)
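For reference, this is roughly how that keras-bert model was loaded (the checkpoint paths are placeholders for wherever the Google BERT files were unpacked):

from keras_bert import load_trained_model_from_checkpoint

kb_model = load_trained_model_from_checkpoint(
    'uncased_L-12_H-768_A-12/bert_config.json',
    'uncased_L-12_H-768_A-12/bert_model.ckpt',
    seq_len=128,
)
kb_model.summary()  # lists the individual embedding / attention / feed-forward layers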

This is running on Colab, with:
tensorflow v 2.3.0
transformers v 3.0.2
torch v 1.6.0+cu101
