Is it possible to add linear layers before lm_head in Text Generation models?


For my use case, I want to add a few linear layers before the `lm_head` layer of a model loaded with `AutoModelForSeq2SeqLM` or `AutoModelForCausalLM`, while still being able to call the `generate` function. I also want the approach to work across multiple architectures, such as BART, T5, and GPT-2.
Is there any easy way to achieve this?
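One direction I'm considering (unverified, and I'm not sure it generalizes to every architecture) is wrapping the existing head via `get_output_embeddings`/`set_output_embeddings`, which most Hugging Face models implement. A minimal sketch with a tiny randomly initialized GPT-2, where the extra layer sizes are just placeholders:

```python
import torch
import torch.nn as nn
from transformers import GPT2Config, GPT2LMHeadModel

# Tiny random model so the sketch runs quickly; a real use case would
# call AutoModelForCausalLM.from_pretrained(...) instead.
config = GPT2Config(n_layer=2, n_head=2, n_embd=64, vocab_size=100)
model = GPT2LMHeadModel(config)

hidden = model.config.hidden_size  # normalized alias (n_embd for GPT-2)
lm_head = model.get_output_embeddings()

# Prepend new linear layers, keeping the original hidden->vocab
# projection as the final step.
new_head = nn.Sequential(
    nn.Linear(hidden, hidden),
    nn.ReLU(),
    nn.Linear(hidden, hidden),
    lm_head,
)
model.set_output_embeddings(new_head)

# generate() should still work, since forward() routes through the
# replaced head module.
out = model.generate(torch.ones(1, 3, dtype=torch.long), max_new_tokens=5)
print(out.shape)
```

I'm unsure whether this breaks things like weight tying or `resize_token_embeddings`, since those may expect the output embeddings module to expose a `.weight` attribute directly.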

Any kind of help would be much appreciated.