Adding a custom layer to GPT-2

Hi everyone,
I am looking for a way to modify GPT-2's architecture slightly by inserting a custom feedforward layer inside a GPT-2 decoder block, right after the masked self-attention sublayer. Is there a way to achieve this with Hugging Face's GPT-2 implementation? I'm new to Hugging Face, so any suggestions would be appreciated.
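To make the question concrete, here is a rough sketch of the kind of thing I have in mind: wrapping a block's attention module so that an extra feedforward runs on the attention output before it re-enters the block. The `AttnWithExtraFF` wrapper and the layer sizes/activation inside it are placeholders I made up, and I haven't verified that the attention module's return format is stable across `transformers` versions:

```python
import torch.nn as nn
from transformers import GPT2LMHeadModel

class AttnWithExtraFF(nn.Module):
    """Wraps a GPT-2 attention module and applies an extra feedforward
    layer to the attention output before it re-enters the block.
    Name, sizes, and activation are placeholders, not a fixed design."""
    def __init__(self, attn, hidden_size):
        super().__init__()
        self.attn = attn
        self.extra_ff = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.GELU(),
            nn.Linear(hidden_size, hidden_size),
        )

    def forward(self, *args, **kwargs):
        # Pass everything through to the original attention module,
        # then transform only the hidden-state output (element 0).
        outputs = self.attn(*args, **kwargs)
        attn_output = self.extra_ff(outputs[0])
        return (attn_output,) + outputs[1:]

model = GPT2LMHeadModel.from_pretrained("gpt2")
hidden_size = model.config.n_embd  # 768 for the base model

# Replace the attention sublayer of, e.g., the first decoder block.
block = model.transformer.h[0]
block.attn = AttnWithExtraFF(block.attn, hidden_size)
```

Would something like this work, or is it safer to subclass `GPT2Block` (or use forward hooks) and override the block's `forward` directly?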
Thank you!