Unable to save and test my fine-tuned model

Hi everyone,

I’m pretty new to the community and I don’t fully understand everything I’m doing.

I’m trying to fine-tune Falcon-7b on a custom dataset whose records look like:
{ "question": "string", "answer": "string" }
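For example, a single record of my dataset has this shape (the question/answer text below is just a placeholder, not my real data):

```python
# Hypothetical record, only to show the structure of my dataset
record = {
    "question": "What is the capital of France?",  # placeholder question
    "answer": "Paris",                             # placeholder answer
}
```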

I use the LoRA method (via peft 0.5.0, as far as I understand it). I save my trained model with:

trainer.save_model(path)
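Roughly, my training setup looks like the sketch below (I don’t have the exact script in front of me, so the LoRA hyperparameters, output directory name and the `train_dataset` variable are illustrative, not my literal values):

```python
# Approximate sketch of my fine-tuning setup with peft 0.5.0
from transformers import AutoModelForCausalLM, AutoTokenizer, Trainer, TrainingArguments
from peft import LoraConfig, get_peft_model

base_model = "tiiuae/falcon-7b"
tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(base_model, trust_remote_code=True)

lora_config = LoraConfig(
    r=16,                                # rank of the LoRA update matrices (illustrative)
    lora_alpha=32,
    target_modules=["query_key_value"],  # Falcon attention projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)  # only the adapter weights are trainable

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="falcon-7b-qa-lora", num_train_epochs=3),
    train_dataset=train_dataset,  # my tokenized question/answer dataset (omitted here)
)
trainer.train()
trainer.save_model("falcon-7b-qa-lora")  # this is the call that produced the 4 files below
```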

After 3h of training, I have 4 files:
adapter_config.json, adapter_model.bin, README.md and training_args.bin

I tried to use this model with oobabooga/text-generation-webui, but it never worked. My .bin files are very small compared to the falcon-7b base model weights.
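For what it’s worth, outside the webui I would expect to load the adapter on top of the base model roughly like this (a sketch assuming peft’s PeftModel.from_pretrained is the right call; "falcon-7b-qa-lora" is my adapter output directory from above):

```python
# Sketch: load the base Falcon-7b model, then apply my saved LoRA adapter on top of it
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model = "tiiuae/falcon-7b"
tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    base_model, torch_dtype=torch.bfloat16, trust_remote_code=True
)
model = PeftModel.from_pretrained(model, "falcon-7b-qa-lora")  # apply the adapter weights

inputs = tokenizer("question: ...", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```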
I don’t understand what’s going wrong. Can someone help me with this? :innocent: