BERT for text classification

Hi. I am trying to use BERT with a CNN-BiLSTM head for text classification, but I seem to be running into an incompatibility between Transformers and TensorFlow. I am on the latest versions: TensorFlow 2.15.0 and Transformers 4.38.2. When I run this:
import tensorflow as tf
from tensorflow.keras.layers import (Input, SpatialDropout1D, Conv1D,
                                     Bidirectional, LSTM, Dense, Dropout)
from tensorflow.keras.models import Model
from transformers import BertTokenizer, TFBertModel

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')

def bert_model(model_name='bert-base-uncased', max_length=50):
    bert_encoder = TFBertModel.from_pretrained(model_name)
    input_ids = Input(shape=(max_length,), dtype=tf.int32, name="input_ids")
    attention_mask = Input(shape=(max_length,), dtype=tf.int32, name="attention_mask")
    # Last hidden state of the encoder: (batch, max_length, 768)
    bert_output = bert_encoder(input_ids, attention_mask=attention_mask)[0]
    x = SpatialDropout1D(0.2)(bert_output)
    x = Conv1D(64, 3, activation='relu')(x)
    x = Bidirectional(LSTM(64, dropout=0.2))(x)
    x = Dense(256, activation='relu')(x)
    x = Dropout(0.4)(x)
    x = Dense(128, activation='relu')(x)
    outputs = Dense(1, activation='sigmoid')(x)
    model = Model(inputs=[input_ids, attention_mask], outputs=outputs)
    return model

model = bert_model()
adam_optimizer = tf.keras.optimizers.Adam(learning_rate=1e-5)
model.compile(loss='binary_crossentropy', optimizer=adam_optimizer, metrics=['accuracy'])

model.summary()

history = model.fit(
    train_dataset,
    epochs=3,
    verbose=1
)
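
For reference, train_dataset is a tf.data.Dataset built from the tokenizer output, roughly like the minimal sketch below (the texts and labels here are placeholders; my real preprocessing is equivalent):

# Placeholder data standing in for my real texts/labels
texts = ["an example sentence", "another example sentence"]
labels = [0, 1]

encodings = tokenizer(
    texts,
    max_length=50,
    padding='max_length',
    truncation=True,
    return_tensors='tf'
)

# Pair (input_ids, attention_mask) with the labels, matching the model's two inputs
train_dataset = tf.data.Dataset.from_tensor_slices(
    ((encodings['input_ids'], encodings['attention_mask']), labels)
).batch(2)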

I get this error:
TypeError: Exception encountered when calling layer 'embeddings' (type TFBertEmbeddings).

Could not build a TypeSpec for name: "tf.debugging.assert_less_24/assert_less/Assert/Assert"
op: "Assert"...

Please help out. Thanks