Hi HF community,
I see that the diffusers library has a feature to dynamically add and remove LoRA weights, described in the article blog/lora-adapters-dynamic-loading.md (main branch of huggingface/blog on GitHub), using `load_lora_weights` and `fuse_lora`. I want to know: can I do something similar with LoRA in transformers too?
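
For context, here is roughly the diffusers pattern I mean (a minimal sketch; the base model and LoRA repo IDs are just illustrative placeholders, not necessarily the ones from the article):

```python
import torch
from diffusers import DiffusionPipeline

# Load the base pipeline once.
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

# Dynamically attach a LoRA and optionally fuse it into the base weights...
pipe.load_lora_weights("some-user/some-lora-xl", adapter_name="style")
pipe.fuse_lora()

image = pipe("a corgi in pixel art style").images[0]

# ...then unfuse and drop it again to swap in a different adapter.
pipe.unfuse_lora()
pipe.unload_lora_weights()
```

And this is roughly the usage I'm hoping for on the transformers side (purely a hypothetical sketch of what I imagine; I don't know whether the PEFT integration actually exposes `load_adapter` / `set_adapter` style methods like this, which is exactly what I'm asking):

```python
from transformers import AutoModelForCausalLM

# Hypothetical usage -- model and adapter repo IDs are placeholders.
model = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")

model.load_adapter("my-org/lora-adapter-a", adapter_name="a")  # assumed API
model.load_adapter("my-org/lora-adapter-b", adapter_name="b")  # assumed API

model.set_adapter("a")      # route generation through adapter A
# ... generate ...
model.set_adapter("b")      # switch to adapter B without reloading the base model
model.disable_adapters()    # fall back to the plain base model
```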