Add per-word embedding from outer source to Bert embedding layer

I have an outer source of per-word embeddings,
and I want to add it as an additional layer to the BERT embeddings,
such that
custom_embedding_layer = (word_embeddings)+(position_embeddings)+(outersource_embeddings)

I saw the answer in:

But I would rather create a wrapper class than change the source code of BertEmbeddings itself.
How can I do it?
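For reference, here is the kind of wrapper I have in mind (a rough sketch, not working code — the class name `BertEmbeddingsWithExternal` and the `external_weights` tensor are my own placeholders; note this adds the external embedding *after* the original module's LayerNorm/dropout, which may not be exactly what is wanted):

```python
import torch
import torch.nn as nn


class BertEmbeddingsWithExternal(nn.Module):
    """Wrap an existing BertEmbeddings module and add an external
    per-word embedding, looked up by the same input_ids."""

    def __init__(self, bert_embeddings, external_weights):
        super().__init__()
        # bert_embeddings: the original (unmodified) BertEmbeddings module
        self.bert_embeddings = bert_embeddings
        # external_weights: (vocab_size, hidden_size) tensor from the outer source
        self.external = nn.Embedding.from_pretrained(external_weights, freeze=True)

    def forward(self, input_ids, token_type_ids=None, position_ids=None):
        # Original BERT embeddings (word + position + token_type, then LayerNorm/dropout)
        base = self.bert_embeddings(
            input_ids=input_ids,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
        )
        # Add the per-word external embedding on top
        return base + self.external(input_ids)
```

The idea would then be to assign an instance of this wrapper back onto the model, e.g. something like `model.embeddings = BertEmbeddingsWithExternal(model.embeddings, external_weights)` — but I am not sure this is the right approach.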