Hi. I am trying to use BERT with a CNN-BiLSTM head for text classification, but I seem to be hitting an incompatibility between Transformers and TensorFlow. I am on the latest versions: TensorFlow 2.15.0 and Transformers 4.38.2. When I run this:
import tensorflow as tf
from tensorflow.keras.layers import (Input, SpatialDropout1D, Conv1D,
                                     Bidirectional, LSTM, Dense, Dropout)
from tensorflow.keras.models import Model
from transformers import BertTokenizer, TFBertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

def bert_model(model_name="bert-base-uncased", max_length=50):
    bert_encoder = TFBertModel.from_pretrained(model_name)
    input_ids = Input(shape=(max_length,), dtype=tf.int32, name="input_ids")
    attention_mask = Input(shape=(max_length,), dtype=tf.int32, name="attention_mask")
    bert_output = bert_encoder(input_ids, attention_mask=attention_mask)[0]
    x = SpatialDropout1D(0.2)(bert_output)
    x = Conv1D(64, 3, activation="relu")(x)
    x = Bidirectional(LSTM(64, dropout=0.2))(x)
    x = Dense(256, activation="relu")(x)
    x = Dropout(0.4)(x)
    x = Dense(128, activation="relu")(x)
    outputs = Dense(1, activation="sigmoid")(x)
    model = Model(inputs=[input_ids, attention_mask], outputs=outputs)
    return model

model = bert_model()
adam_optimizer = tf.keras.optimizers.Adam(learning_rate=1e-5)
model.compile(loss="binary_crossentropy", optimizer=adam_optimizer, metrics=["accuracy"])
model.summary()

history = model.fit(
    train_dataset,
    epochs=3,
    verbose=1
)
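For completeness, `train_dataset` is a `tf.data.Dataset` of `(features, label)` pairs keyed by `input_ids` and `attention_mask`. This is a simplified stand-in for my pipeline (random ids replace the real tokenizer output so the snippet runs on its own):

```python
import numpy as np
import tensorflow as tf

max_length = 50
n_samples = 8

# In the real pipeline these arrays come from
# tokenizer(texts, padding="max_length", truncation=True, max_length=50);
# random ids in the BERT vocab range keep this sketch self-contained.
encodings = {
    "input_ids": np.random.randint(0, 30522, size=(n_samples, max_length)).astype(np.int32),
    "attention_mask": np.ones((n_samples, max_length), dtype=np.int32),
}
labels = np.random.randint(0, 2, size=(n_samples,)).astype(np.float32)

# Dict of inputs + scalar label per example, shuffled and batched for fit().
train_dataset = (
    tf.data.Dataset.from_tensor_slices((encodings, labels))
    .shuffle(n_samples)
    .batch(4)
)
```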
I get this error:

TypeError: Exception encountered when calling layer "embeddings" (type TFBertEmbeddings).
Could not build a TypeSpec for name: "tf.debugging.assert_less_24/assert_less/Assert/Assert"
op: "Assert"…
Please help out. Thanks