Trouble training LoRA in Oobabooga

Hey. I hope this is the correct place to post this. First time poster.
I’m sorry if I’m being totally stupid for not figuring this out, but I’m at a loss. :slight_smile:
I’ve recently downloaded Oogabooga and the Llama-2-13B-Chat-GPTQ model from TheBloke.
Now I’m experimenting with training LoRAs, and I get this error when trying to train:

```
  File "C:\AI\Oogabooga\text-generation-webui-main\text-generation-webui-main\modules\training.py", line 545, in do_train
    lora_model = get_peft_model(shared.model, config)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\AI\Oogabooga\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\peft\mapping.py", line 133, in get_peft_model
    return MODEL_TYPE_TO_PEFT_MODEL_MAPPING[peft_config.task_type](model, peft_config, adapter_name=adapter_name)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\AI\Oogabooga\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\peft\peft_model.py", line 1043, in __init__
    super().__init__(model, peft_config, adapter_name)
  File "C:\AI\Oogabooga\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\peft\peft_model.py", line 125, in __init__
    self.base_model = cls(model, {adapter_name: peft_config}, adapter_name)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\AI\Oogabooga\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\peft\tuners\lora\model.py", line 111, in __init__
    super().__init__(model, config, adapter_name)
  File "C:\AI\Oogabooga\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\peft\tuners\tuners_utils.py", line 90, in __init__
    self.inject_adapter(self.model, adapter_name)
  File "C:\AI\Oogabooga\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\peft\tuners\tuners_utils.py", line 250, in inject_adapter
    raise ValueError(
ValueError: Target modules {'q_proj', 'v_proj'} not found in the base model. Please check the target modules and try again.
```

Yes I realize I called it oogabooga by mistake :smiley:

I’m also brand new to this and getting the exact same message. I think there’s a fundamental issue, like an incompatible model or the wrong target modules for that model, but I can’t be sure. I hope someone can decode this error. Any luck, OP? I’ll post an update if I figure it out too. Thanks
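For anyone else debugging this: PEFT resolves `target_modules` by matching each entry against the names of the model’s submodules, so the error means no submodule named `q_proj` or `v_proj` was found in whatever object the webui handed to `get_peft_model` (GPTQ loaders can wrap or rename layers). One way to check is to print the model’s module-name suffixes and compare them against your targets. A minimal sketch, using a toy `nn.Module` as a stand-in for the real `shared.model`:

```python
import torch.nn as nn

# Toy stand-in for one transformer attention block; in the webui you would
# inspect shared.model instead of building this.
class ToyAttention(nn.Module):
    def __init__(self):
        super().__init__()
        self.q_proj = nn.Linear(8, 8)
        self.v_proj = nn.Linear(8, 8)

model = nn.Sequential(ToyAttention())

# Collect the last path component of every named submodule -- this is the
# kind of suffix matching PEFT performs when resolving target_modules.
suffixes = {name.split(".")[-1] for name, _ in model.named_modules() if name}
print(sorted(suffixes))

# Any target missing from this set would trigger the ValueError above.
missing = {"q_proj", "v_proj"} - suffixes
print("missing targets:", missing)
```

If `missing` comes back non-empty for your real model, the loader is exposing different layer names (or wrapping them), and the fix is to either change `target_modules` to match or load the model in a way PEFT supports.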