How do I input word2vec embeddings into the GPT-2 model?

Hi,
I am working with the Hugging Face GPT-2 model. I have a word2vec model trained on a dataset, with the same embedding dimension as GPT-2 (768). Now I want to feed these embeddings into GPT-2. I understand I should use inputs_embeds to pass the embeddings directly, but I am a little unclear about how exactly to do it, since I am not very handy with complex coding. Are there any examples I can refer to that show the fine-tuning part and how to feed my word embeddings into the GPT-2 model? A rough sketch of what I think this looks like is below. Any source or help would be appreciated.