I’m getting the same error when trying to fine-tune Llama-7b using PEFT with prefix-tuning. When I switch it to LoRA, there is no error.
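In case it helps to reproduce, here is roughly what my setup looks like; the model name and hyperparameter values below are illustrative, not my exact config:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, PrefixTuningConfig, TaskType, get_peft_model

# Illustrative checkpoint name; substitute your local Llama-7b weights
model = AutoModelForCausalLM.from_pretrained("huggyllama/llama-7b")

# Prefix-tuning config: this is the path that hits the error for me
prefix_config = PrefixTuningConfig(
    task_type=TaskType.CAUSAL_LM,
    num_virtual_tokens=30,
)

# Equivalent LoRA config: swapping to this makes the error go away
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
)

model = get_peft_model(model, lora_config)    # trains fine
# model = get_peft_model(model, prefix_config)  # raises the error
```

Everything else in the training loop is unchanged between the two runs, so the difference really does seem to come down to the PEFT method.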