How to add Sentence Bert to keras sequential model?

Purpose:

I am using the Universal Sentence Encoder in a Keras Sequential model and it works for me, but I want to replace USE with Sentence BERT (Hugging Face Sentence BERT and Sentence BERT):

Model Architecture:

import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_addons as tfa
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input, Dense, BatchNormalization, Dropout
from tensorflow.keras.optimizers import Adam

module_url = "https://tfhub.dev/google/universal-sentence-encoder/4"
embed_model = hub.load(module_url)


model = Sequential([
        Input(shape=[], dtype=tf.string),
        hub.KerasLayer(embed_model, trainable=True, name='USE_embedding'),
        Dense(64, activation='relu'),
        BatchNormalization(),
        Dropout(0.5),
        Dense(32, activation='relu'),
        BatchNormalization(),
        Dropout(0.5),
        Dense(2, activation='sigmoid')
    ])

model.compile(optimizer=Adam(learning_rate=0.0001),
              loss='binary_crossentropy',
              metrics=[get_f1,  # custom F1 metric defined elsewhere
                       tf.keras.metrics.BinaryAccuracy(),
                       tf.keras.metrics.Precision(),
                       tf.keras.metrics.Recall(),
                       'accuracy',
                       tfa.metrics.F1Score(num_classes=2,
                                           average=None,
                                           name="F1-score_")])

How can I use Sentence BERT instead of the embed_keras_layer (USE)?

I tried different ways but it keeps giving me errors. Keras works with its own hub.KerasLayer, while Sentence BERT is difficult to handle that way. Any help would be appreciated, thanks. (A sketch of one possible workaround, pre-computing the embeddings outside Keras, follows the encode snippet below.)

# ----- how to use this transformer in a Sequential model -----

from sentence_transformers import SentenceTransformer
sbert_model = SentenceTransformer('paraphrase-MiniLM-L6-v2')  # renamed so it does not overwrite the Keras model above

# Sentences we want to encode. Example:
sentence = ['This framework generates embeddings for each input sentence']

# Sentences are encoded by calling sbert_model.encode()
embedding = sbert_model.encode(sentence)
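
One possible workaround (just a sketch, and probably not what I ultimately want because the Sentence BERT weights stay frozen this way, unlike trainable=True in the USE version): pre-compute the embeddings with encode() and feed the resulting float vectors to the Sequential model instead of raw strings. The toy sentences, labels, and the 384-dimension note for 'paraphrase-MiniLM-L6-v2' are assumptions made only for illustration:

import numpy as np
import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input, Dense, BatchNormalization, Dropout
from tensorflow.keras.optimizers import Adam
from sentence_transformers import SentenceTransformer

sbert_model = SentenceTransformer('paraphrase-MiniLM-L6-v2')

# hypothetical toy data, only for illustration
train_sentences = ['This framework generates embeddings for each input sentence',
                   'Another made-up example sentence']
train_labels = np.array([[1., 0.], [0., 1.]])

# encode the sentences once, outside the Keras graph (the embeddings are not fine-tuned)
train_embeddings = sbert_model.encode(train_sentences)   # numpy array, shape (n, 384) for this model

clf = Sequential([
    Input(shape=(train_embeddings.shape[1],)),            # float vectors instead of tf.string
    Dense(64, activation='relu'),
    BatchNormalization(),
    Dropout(0.5),
    Dense(32, activation='relu'),
    BatchNormalization(),
    Dropout(0.5),
    Dense(2, activation='sigmoid')
])

clf.compile(optimizer=Adam(learning_rate=0.0001),
            loss='binary_crossentropy',
            metrics=[tf.keras.metrics.BinaryAccuracy(),
                     tf.keras.metrics.Precision(),
                     tf.keras.metrics.Recall()])

clf.fit(train_embeddings, train_labels, epochs=2, batch_size=2)

This trains the classifier on fixed Sentence BERT embeddings, so it does not fine-tune the encoder inside the model, which is what I am still trying to achieve with a hub.KerasLayer-style layer.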