Combine LoRA with full fine-tuning

Hi, I added some custom layers to an LLM. I would like to fully train all the custom parameters, but use LoRA for the original LLM parameters. Is it possible to combine LoRA in some layers with full fine-tuning in others?
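Roughly, this is what I have in mind (a minimal PyTorch sketch of the idea with a hand-rolled LoRA wrapper, not a real library; the layer names and sizes are made up for illustration):

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen pretrained Linear plus a trainable low-rank (LoRA) update."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pretrained weights
        # Only these low-rank matrices receive gradients.
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scaling = alpha / r

    def forward(self, x):
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        # Stand-in for a pretrained LLM layer: adapted with LoRA.
        self.backbone = LoRALinear(nn.Linear(32, 32))
        # My custom layer: trained fully (all parameters trainable).
        self.custom_head = nn.Linear(32, 4)

model = Model()
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only LoRA matrices and the custom head
```

So the optimizer would see the LoRA matrices and the custom layers, while the pretrained weights stay frozen.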

Thank you