How do I add an input embedding layer to a transformer?

I pre-trained an input embedding layer and would like to add it in front of the transformer. How can I do that? The figure below shows my idea. All the tutorials I found use a tokenizer to process the raw text source, but it seems I do not need a tokenizer. Any help would be appreciated!
[figure: pre-trained embedding layer feeding into the transformer]
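For concreteness, here is a minimal sketch of the idea, assuming a Hugging Face Transformers model (BERT is used as a stand-in): most models accept precomputed vectors via the `inputs_embeds` argument instead of `input_ids`, which bypasses the tokenizer and the model's own embedding lookup entirely.

```python
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

batch_size, seq_len = 2, 16
hidden = model.config.hidden_size  # 768 for bert-base

# Stand-in for the output of a custom pre-trained embedding layer;
# it must match the model's hidden size.
my_embeds = torch.randn(batch_size, seq_len, hidden)

# Note: for BERT, position and token-type embeddings are still added
# internally on top of whatever you pass as inputs_embeds.
with torch.no_grad():
    out = model(inputs_embeds=my_embeds)

print(out.last_hidden_state.shape)  # torch.Size([2, 16, 768])
```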

This is cool, but I don't know if I'm really understanding it. Are you wanting to add a layer like in Keras? You might try something like this for embeddings.

Thank you @Hatman! I would like to replace the transformer's embedding layer with my matrix; the other layers would still load the pre-trained transformer weights.
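Something like the following sketch captures that, assuming a Hugging Face model and a matrix whose shape matches the model's vocabulary and hidden size (`my_matrix` here is a random stand-in for the pre-trained matrix):

```python
import torch
import torch.nn as nn
from transformers import AutoModel

# Every layer loads its pretrained weights here.
model = AutoModel.from_pretrained("bert-base-uncased")

# Hypothetical pre-trained matrix; shape must be (vocab_size, hidden_size).
my_matrix = torch.randn(model.config.vocab_size, model.config.hidden_size)

# Swap in a new embedding layer built from the matrix. Only the word
# embeddings are replaced; all other weights stay as loaded above.
# freeze=True keeps the matrix fixed during any subsequent training.
model.set_input_embeddings(nn.Embedding.from_pretrained(my_matrix, freeze=True))
```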

Why wouldn't you just fine-tune the pretrained model using your embeddings? Doing it manually would be tricky; I'm not sure you'd want to dig into the actual .py files to make it happen. I bet you could, but I'd suggest going with something like Keras for more flexibility. I could be wrong. If not, this Keras tutorial might be what you're looking for.
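For reference, a minimal sketch of the Keras route: initialize an `Embedding` layer from a pre-trained matrix (the sizes and the matrix below are hypothetical stand-ins).

```python
import numpy as np
import tensorflow as tf

vocab_size, embed_dim = 30522, 768  # hypothetical sizes
my_matrix = np.random.rand(vocab_size, embed_dim).astype("float32")  # stand-in

embedding = tf.keras.layers.Embedding(
    vocab_size,
    embed_dim,
    embeddings_initializer=tf.keras.initializers.Constant(my_matrix),
    trainable=False,  # set True if you want fine-tuning to update the embeddings
)
```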