DoRA memory requirements

Hello Hugging Face community! I am currently trying to use QDoRA to fine-tune Mistral-7B-Instruct. As per the documentation, I have set the use_dora flag to True:
```python
from peft import LoraConfig

config = LoraConfig(use_dora=True, …)
```
Why am I now seeing roughly a 10 GB increase in GPU memory usage during training, when none of my other training parameters have changed?
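
For reference, here is a minimal sketch of my setup. The model revision, rank, alpha, and target modules below are illustrative placeholders rather than my exact values; the structure (4-bit quantized base model plus a DoRA adapter via PEFT) matches what I am running:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# 4-bit quantized base model (the "Q" in QDoRA)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-Instruct-v0.2",  # placeholder revision
    quantization_config=bnb_config,
    device_map="auto",
)

# DoRA adapter on top of the quantized base
config = LoraConfig(
    r=16,                                  # placeholder rank
    lora_alpha=32,                         # placeholder scaling
    target_modules=["q_proj", "v_proj"],   # placeholder target modules
    use_dora=True,
)

model = get_peft_model(model, config)
model.print_trainable_parameters()
```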