I followed the example in https://github.com/huggingface/peft/blob/main/examples/sequence_classification/LoRA.ipynb
The only difference is that my base model is BioGPT, while the example uses RoBERTa.
When loading the model like this:
from peft import PeftConfig, PeftModel
from transformers import AutoModel, AutoTokenizer

peft_model_id = "Lukee4/biogpt-2019"
config = PeftConfig.from_pretrained(peft_model_id)
inference_model = AutoModel.from_pretrained(config.base_model_name_or_path)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)
inference_model = PeftModel.from_pretrained(inference_model, peft_model_id)
I get the error: ValueError: Target modules ['c_attn'] not found in the base model. Please check the target modules and try again.
My config is available on the Hub: "Lukee4/biogpt-2019"
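For what it's worth, I also listed the linear module names of the base model to compare against the target_modules stored in my adapter config. This is just a diagnostic sketch and assumes the base checkpoint resolves to microsoft/biogpt; as far as I can tell, the BioGPT attention layers use names like q_proj/k_proj/v_proj/out_proj rather than c_attn:

# List the names of the Linear submodules in the base model so I can see
# which names are valid LoRA target_modules (assumes microsoft/biogpt).
from transformers import AutoModel
import torch.nn as nn

base = AutoModel.from_pretrained("microsoft/biogpt")
linear_names = sorted({
    name.split(".")[-1]
    for name, module in base.named_modules()
    if isinstance(module, nn.Linear)
})
print(linear_names)  # no 'c_attn' in this list on my end

So it looks like the 'c_attn' entry in my config does not match any module in BioGPT. Is there something I should change when creating the LoraConfig, or when saving/loading the adapter, so the right target modules are recorded?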