How to combine ReFT modules with a base model?

Hello.
I am trying to combine the ReFT modules with the Llama-3-8B-Instruct base model. I downloaded the Llama 3 model and now I want to incorporate the ReFT modules into it. Here is my code for loading Llama 3 and the ReFT model:

import torch, transformers, pyreft

model_name = "meta-llama/Meta-Llama-3-8B-Instruct"
path = "Hamana0509/ReFT_Orpo_Llama3_8B_Instruct"
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = transformers.AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.bfloat16, device_map=device
)

reft_model = pyreft.ReftModel.load(path, model, from_huggingface_hub=True)
reft_model.set_device("cuda")

When I run this, I get the following error:

RuntimeError: Error(s) in loading state_dict for Linear:
	size mismatch for weight: copying a param with shape torch.Size([4, 4096]) from checkpoint, the shape in current model is torch.Size([2, 4096]).
	size mismatch for bias: copying a param with shape torch.Size([4]) from checkpoint, the shape in current model is torch.Size([2]).
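
My reading of the shapes (this is an assumption, not something I have confirmed): the checkpoint's intervention projection was saved with low-rank dimension 4, while the intervention constructed at load time has low-rank dimension 2. A minimal sketch of the same mismatch with a plain `torch.nn.Linear`, outside of pyreft:

```python
import torch

# Assumption: the saved intervention projects 4096 -> 4 (rank 4),
# while the freshly built one projects 4096 -> 2 (rank 2).
ckpt = torch.nn.Linear(4096, 4)     # stands in for the checkpointed projection
current = torch.nn.Linear(4096, 2)  # stands in for the projection built at load time

try:
    # Same failure mode as in my traceback: weight [4, 4096] vs [2, 4096]
    current.load_state_dict(ckpt.state_dict())
except RuntimeError as e:
    print("size mismatch" in str(e))  # the shapes do not line up

# Rebuilding the module with the matching rank loads cleanly:
current = torch.nn.Linear(4096, 4)
current.load_state_dict(ckpt.state_dict())
```

If that reading is right, I would guess the `low_rank_dimension` in the `ReftConfig` (and in each intervention) needs to match the value used at training time before calling `ReftModel.load`, but I am not sure where pyreft picks that value up from when loading from the Hub.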

Does anyone have suggestions for resolving this?
Here is my source code (I re-implemented ORPOReftTrainer based on ORPOTrainer from trl): Google Colab
This is my ReFT modules repo: Hamana0509/ReFT_Orpo_Llama3_8B_Instruct · Hugging Face
Thanks a lot :hugs: