adapter_model.safetensors size is very big

I trained a model following the tutorial below.

Here is the Kaggle code.

The tutorial mentions it generates an adapter_model.safetensors of around 160 MB, but when I run the same script it produces a 2.27 GB file.

When I use Unsloth, however, it produces the adapter at around 160 MB. But I don't want to use Unsloth. Here is the Unsloth fine-tuning code. (You can find it in the reply.)

After changing the versions of trl and peft, the issue was resolved:

%pip install -U peft==0.11.1
%pip install -U "trl<0.9.0"

For more reference, please check the Kaggle discussion below.

With both meta-llama/Meta-Llama-3-8B-Instruct and meta-llama/Meta-Llama-3.1-8B-Instruct, I get a 2.27 GB adapter_model.safetensors.
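A plausible explanation (this is my assumption, based on reports that newer trl/peft versions save the resized token embeddings inside the adapter after `setup_chat_format` adds chat tokens): the extra ~2.1 GB matches the size of Llama-3's `embed_tokens` and `lm_head` matrices in bf16 on top of the expected ~160 MB of LoRA weights. A quick back-of-the-envelope check, assuming Llama-3's vocabulary size of 128256 and the 8B model's hidden size of 4096:

```python
# Back-of-the-envelope check: why 2.27 GB instead of ~160 MB?
VOCAB = 128256   # Llama-3 / Llama-3.1 vocabulary size
HIDDEN = 4096    # Llama-3-8B hidden size
BYTES = 2        # bf16/fp16 is 2 bytes per parameter

# If embed_tokens and lm_head are saved alongside the adapter:
embeddings_gb = 2 * VOCAB * HIDDEN * BYTES / 1e9
lora_gb = 0.17   # roughly the expected 160+ MB adapter

print(f"embeddings: {embeddings_gb:.2f} GB")            # ~2.10 GB
print(f"total:      {embeddings_gb + lora_gb:.2f} GB")  # ~2.27 GB
```

The total lines up with the 2.27 GB file, which is why pinning the older trl/peft versions (which don't pull the embeddings into the adapter) brings the size back down.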

Here are the packages I used:

%%capture
%pip install -U transformers 
%pip install -U datasets 
%pip install -U accelerate 
%pip install -U peft==0.11.1
%pip install -U "trl<0.9.0"
%pip install -U bitsandbytes 
%pip install -U wandb
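To see exactly which tensors are inflating the file, you can list the adapter's contents without loading any weights. A minimal sketch using only the standard library, relying on the safetensors file layout (an 8-byte little-endian header length followed by a JSON header with per-tensor byte offsets); the demo file and tensor names below are made up for illustration:

```python
import json
import struct
import tempfile

def list_tensors(path):
    """Return {tensor_name: size_in_bytes}, largest first, by parsing the
    safetensors header: 8 bytes of little-endian length, then JSON."""
    with open(path, "rb") as f:
        (hlen,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(hlen))
    sizes = {
        name: info["data_offsets"][1] - info["data_offsets"][0]
        for name, info in header.items()
        if name != "__metadata__"  # optional metadata entry, not a tensor
    }
    return dict(sorted(sizes.items(), key=lambda kv: -kv[1]))

def write_fake_safetensors(path, tensors):
    """Build a tiny zero-filled file with the same layout as a real one."""
    header, blobs, offset = {}, [], 0
    for name, nbytes in tensors.items():
        header[name] = {"dtype": "F16", "shape": [nbytes // 2],
                        "data_offsets": [offset, offset + nbytes]}
        blobs.append(b"\x00" * nbytes)
        offset += nbytes
    hj = json.dumps(header).encode()
    with open(path, "wb") as f:
        f.write(struct.pack("<Q", len(hj)) + hj + b"".join(blobs))

# Demo on a hand-built file (point list_tensors at your real
# adapter_model.safetensors instead):
with tempfile.NamedTemporaryFile(suffix=".safetensors", delete=False) as tmp:
    write_fake_safetensors(tmp.name, {
        "base_model.model.embed_tokens.weight": 4096,  # hypothetical names
        "lora_A.weight": 64,
    })
    print(list_tensors(tmp.name))
```

If the 2.27 GB adapter lists embedding-sized tensors at the top, that confirms the base embeddings were saved into the adapter rather than just the LoRA matrices.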
