Hugging Face Forums
LoRA Adapter Loading Issue with Llama 3.1 8B - Missing Keys Warning
Beginners
John6666
March 31, 2025, 11:12am
2
I think this warning is probably related to this specification.
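In case it helps, here is a minimal sketch (my own example, not the original poster's code) of loading a LoRA adapter onto Llama 3.1 8B with PEFT and checking that the adapter weights were actually attached; the model ID and adapter path below are placeholders.

```python
# Minimal sketch, assuming a PEFT-format LoRA adapter saved on disk.
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the base model (placeholder model ID).
base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.1-8B",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Attach the trained LoRA adapter (placeholder path). A "missing keys" warning
# at this stage usually refers to base-model weights that the adapter
# checkpoint does not (and should not) contain.
model = PeftModel.from_pretrained(base, "path/to/lora_adapter")

# Sanity check: confirm LoRA modules are present on the wrapped model.
lora_params = [n for n, _ in model.named_parameters() if "lora_" in n]
print(f"{len(lora_params)} LoRA parameter tensors loaded")
```

If the LoRA parameter count comes back non-zero and generation outputs differ from the base model, the warning is generally benign.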
Related topics

| Topic | Category | Replies | Views | Activity |
|---|---|---|---|---|
| Lora: missing adapter keys while loading the checkpoint | Intermediate | 2 | 635 | January 6, 2025 |
| Using LoRA Adapters | Beginners | 0 | 2102 | January 24, 2024 |
| `get_peft_model` or `model.add_adapter` | Beginners | 2 | 1044 | February 17, 2025 |
| Loading an LoRA adapter trained on quantized model on a non-quantized model | Intermediate | 0 | 1340 | November 7, 2023 |
| Issue with LoRA Adapter Loading on Multiple GPUs during Fine-Tuning with Accelerate and SFTTrainer | 🤗Accelerate | 3 | 835 | September 18, 2024 |