Hugging Face Forums
Retraining a peft model after loading
Beginners
John6666
February 15, 2025, 7:56am
> load_in_8bit=True, # Load in 8-bit precision

Maybe it’s because of the quantization.
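If the goal is to keep training a saved adapter on top of an 8-bit base model, a minimal sketch of one common setup is below. It is an assumption about the workflow, not the original poster's code: the model name and adapter path are placeholders. Loading the adapter with `is_trainable=True` and running `prepare_model_for_kbit_training` on the quantized base keeps the adapter parameters trainable; without that, errors such as "element 0 of tensors does not require grad" can show up when training resumes.

```python
# Sketch: reload a quantized base model plus a previously trained LoRA adapter
# so that the adapter can be trained further. Model name and adapter path are
# placeholders, not values from this thread.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel, prepare_model_for_kbit_training

bnb_config = BitsAndBytesConfig(load_in_8bit=True)  # 8-bit quantization

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",        # placeholder base model
    quantization_config=bnb_config,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Prepare the quantized model for training (enables input grads,
# casts norm layers, sets up gradient checkpointing hooks).
base = prepare_model_for_kbit_training(base)

# Load the saved adapter in trainable mode; the default is inference
# mode, which leaves the adapter weights with requires_grad=False.
model = PeftModel.from_pretrained(base, "path/to/adapter", is_trainable=True)
model.print_trainable_parameters()
```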
Related topics

| Topic | Category | Replies | Views | Activity |
|---|---|---|---|---|
| Loading Peft model from checkpoint leading into size mismatch | 🤗Transformers | 6 | 10910 | February 7, 2024 |
| Peft Model For SequenceClassification failing _is_peft_model | Beginners | 0 | 933 | February 11, 2024 |
| Retraining peft model | Intermediate | 3 | 2995 | March 1, 2024 |
| I used to have no problem with PEFT fine-tuning after hundreds of trainings, but now I have encountered the error RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn | 🤗AutoTrain | 2 | 45 | October 1, 2025 |
| Facing error while adding multiple adapters to a model | 🤗Transformers | 1 | 804 | July 8, 2024 |