Loading a trained model gives an error that weights are randomly initialized

Hello,

I fine-tuned an OPTForCausalLM model and saved it to disk as follows:

trainer.save_model("peft-finetuned-10e")  # save the trained weights to this directory
model.config.to_json_file("peft-finetuned-10e/config.json")  # write the model config alongside them

When I try to load the model again using from_pretrained, I see a warning:

from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained('./peft-finetuned-10e/config.json')  # load the saved config
model = AutoModelForCausalLM.from_pretrained('./peft-finetuned-10e/', config=config, from_tf=False)  # load the saved weights

Here is the full warning output:

Some weights of the model checkpoint at ./peft-finetuned-10e/ were not used when initializing OPTForCausalLM: ['base_model.model.model.decoder.layers.14.fc2.bias', 
...
...
...
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
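
From the base_model.model. prefixes in the warning, I'm guessing the saved checkpoint only contains the PEFT adapter weights rather than a full model. Am I supposed to load it through the peft library instead, along the lines of the sketch below? (The base model name here is just a placeholder for whatever OPT checkpoint I started from.)

from transformers import AutoModelForCausalLM
from peft import PeftModel

# load the original base OPT model first (placeholder checkpoint name)
base_model = AutoModelForCausalLM.from_pretrained("facebook/opt-1.3b")

# then attach the saved adapter weights from the fine-tuning run on top of it
model = PeftModel.from_pretrained(base_model, "./peft-finetuned-10e/")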