How to use a custom positional embedding while fine-tuning BERT

Hi, I want to fine-tune BERT for question answering, but instead of using the built-in learned position embeddings (which are looked up by token position index), I want to supply my own positional embedding, for example the sinusoidal positional embedding from the paper “Attention Is All You Need”. I understand one way to do that would be to modify the input embeddings in the BertModel class, but I am not sure how to modify that class. If there is a better way, I would be happy to follow it. Any suggestions would be highly appreciated.
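For reference, the sinusoidal scheme from “Attention Is All You Need” can be computed as a fixed table, independent of any model. A minimal numpy sketch (the function name is mine, not from any library):

```python
import numpy as np

def sinusoidal_position_embeddings(seq_len, dim):
    """Fixed table from "Attention Is All You Need":
    PE[pos, 2i]   = sin(pos / 10000^(2i/dim))
    PE[pos, 2i+1] = cos(pos / 10000^(2i/dim))
    """
    positions = np.arange(seq_len)[:, None]  # shape (seq_len, 1)
    # 1 / 10000^(2i/dim) for each even dimension index 2i
    div_term = np.exp(np.arange(0, dim, 2) * (-np.log(10000.0) / dim))
    pe = np.zeros((seq_len, dim))
    pe[:, 0::2] = np.sin(positions * div_term)  # even dims get sine
    pe[:, 1::2] = np.cos(positions * div_term)  # odd dims get cosine
    return pe

# Table sized for BERT-base (hidden size 768)
pe = sinusoidal_position_embeddings(512, 768)
```

Adding `pe` (as a non-trainable tensor) to the token embeddings instead of BERT's learned position embeddings is the change being asked about.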

Did you solve this problem? I’m trying to solve the same thing.

All you need to do is change the BertEmbeddings class: build a new class that inherits from BertEmbeddings.
You can follow the post (I wrote the solution in it).
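A minimal sketch of that idea, assuming the `transformers` library: subclass `BertEmbeddings` and overwrite its learned position table with a frozen sinusoidal one (the class name `SinusoidalBertEmbeddings` is mine). Note that constructing the subclass reinitializes the word embeddings, so for fine-tuning you would still copy the pretrained word and token-type weights over (or only overwrite `model.embeddings.position_embeddings.weight` on a loaded model).

```python
import math
import torch
from transformers import BertConfig
from transformers.models.bert.modeling_bert import BertEmbeddings

class SinusoidalBertEmbeddings(BertEmbeddings):
    """BertEmbeddings with the learned position table replaced by
    the fixed sinusoidal table from "Attention Is All You Need"."""

    def __init__(self, config):
        super().__init__(config)
        n_pos, dim = config.max_position_embeddings, config.hidden_size
        position = torch.arange(n_pos).unsqueeze(1).float()
        div_term = torch.exp(torch.arange(0, dim, 2).float()
                             * (-math.log(10000.0) / dim))
        pe = torch.zeros(n_pos, dim)
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        # Overwrite the learned position embeddings and freeze them
        with torch.no_grad():
            self.position_embeddings.weight.copy_(pe)
        self.position_embeddings.weight.requires_grad = False

# Quick shape check with a tiny config (no pretrained weights needed)
cfg = BertConfig(vocab_size=100, hidden_size=16,
                 max_position_embeddings=32, type_vocab_size=2)
emb = SinusoidalBertEmbeddings(cfg)
out = emb(input_ids=torch.tensor([[1, 2, 3]]))
```

On a pretrained model you would then swap the module in, e.g. `model.embeddings = SinusoidalBertEmbeddings(model.config)` followed by copying `word_embeddings` and `token_type_embeddings` from the original — the forward pass of `BertEmbeddings` is reused unchanged, since it already adds `position_embeddings` to the token embeddings.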