Yes, PEFT does not depend on `Trainer` or `SFTTrainer`. Here is one example of using LoRA with a custom training loop:
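The snippet below is a minimal sketch, not a definitive recipe: the model name, the dummy data, and the hyperparameters (`r`, `lora_alpha`, learning rate, `num_epochs`) are placeholders you would swap for your own.

```python
import torch
from torch.utils.data import DataLoader
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "facebook/opt-350m"  # placeholder; any causal LM should work
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Attach LoRA adapters; only the adapter weights remain trainable.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # sanity check: a tiny fraction of params

# Dummy data just to make the loop runnable; use your own dataset here.
# (In real code, set label positions for padding tokens to -100.)
texts = ["Hello world.", "PEFT works fine in a plain PyTorch loop."]
enc = tokenizer(texts, return_tensors="pt", padding=True)
enc["labels"] = enc["input_ids"].clone()
dataset = [{k: v[i] for k, v in enc.items()} for i in range(len(texts))]
dataloader = DataLoader(dataset, batch_size=2)

# Only pass the trainable (adapter) parameters to the optimizer.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
model.train()

num_epochs = 1  # placeholder
for epoch in range(num_epochs):
    for batch in dataloader:
        batch = {k: v.to(device) for k, v in batch.items()}
        outputs = model(**batch)  # loss is computed from the labels
        outputs.loss.backward()
        optimizer.step()
        optimizer.zero_grad()

model.save_pretrained("lora-adapter")  # saves only the adapter weights
```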
Depending on what you would like to do, custom training loops can get a bit more involved (think mixed precision, gradient accumulation, etc.), but it's absolutely possible; a library like Accelerate can take care of most of that for you, as in the sketch below.
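For instance, here is a hedged sketch of the same loop using Hugging Face Accelerate to handle mixed precision and gradient accumulation. It assumes the `model`, `optimizer`, `dataloader`, and `num_epochs` from the previous snippet; the `fp16` and accumulation settings are placeholders.

```python
from accelerate import Accelerator

# Accelerate handles device placement, fp16 autocasting/scaling,
# and skipping optimizer steps between accumulation boundaries.
accelerator = Accelerator(mixed_precision="fp16", gradient_accumulation_steps=4)
model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

model.train()
for epoch in range(num_epochs):
    for batch in dataloader:  # batches are already on the right device
        # Gradients only sync and the optimizer only steps every
        # gradient_accumulation_steps batches inside accumulate().
        with accelerator.accumulate(model):
            outputs = model(**batch)
            accelerator.backward(outputs.loss)
            optimizer.step()
            optimizer.zero_grad()
```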