Can I dynamically add or remove LoRA weights in the Transformers library like in Diffusers?

Hi HF community,

I see that the Diffusers library has a feature for dynamically adding and removing LoRA weights, described in blog/lora-adapters-dynamic-loading.md at main · huggingface/blog · GitHub and exposed through `load_lora_weights` and `fuse_lora`. Can I do something similar with LoRA in Transformers?
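
For reference, this is roughly the Diffusers workflow I mean (a minimal sketch; the checkpoint and LoRA repo IDs below are placeholders):

```python
# Minimal sketch of dynamic LoRA loading/fusing in Diffusers.
# The pipeline checkpoint and LoRA repo IDs are illustrative placeholders.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

# Dynamically attach a LoRA, then fuse it into the base weights
pipe.load_lora_weights("some-user/some-sdxl-lora")  # placeholder repo ID
pipe.fuse_lora()

# ... generate images ...

# Undo the fusion and detach the LoRA to recover the base pipeline
pipe.unfuse_lora()
pipe.unload_lora_weights()
```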

Hi!

Yes, with the PEFT library you can merge and unmerge LoRA adapters on any model. See the PEFT docs for more information.
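
As a minimal sketch of what that looks like with PEFT (the base model ID and adapter paths below are placeholders, not from the thread):

```python
# Minimal sketch: dynamically attaching, merging, unmerging, and swapping
# LoRA adapters with PEFT. Model and adapter IDs are placeholders.
from transformers import AutoModelForCausalLM
from peft import PeftModel

base_model = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder base model

# Attach a trained LoRA adapter to the base model
model = PeftModel.from_pretrained(base_model, "path/to/lora-adapter")

# Merge the LoRA deltas into the base weights for faster inference
model.merge_adapter()

# ... run inference ...

# Unmerge to restore the original base weights so the adapter can be swapped
model.unmerge_adapter()

# Load a second adapter under a name and switch to it dynamically
model.load_adapter("path/to/another-lora-adapter", adapter_name="other")
model.set_adapter("other")
```

If you no longer need to unmerge, `merge_and_unload()` merges the active adapter and returns the plain base model.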


Thank you for your help!
