This is an AutoTrain model I created based on Llama 3.3 70B, but the "run locally on Ollama" option does not show up, even though it is selected in the list.
Isn’t the library name peft instead of transformers?
If you created a LoRA or other adapter during training, peft is correct, but if not, you should specify transformers in the model card metadata.
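As a minimal sketch of what that looks like (assuming a standard Hub repo; the exact base_model ID and tags here are illustrative), the library_name field lives in the YAML front matter at the top of the repo's README.md:

```yaml
---
# Model card metadata (YAML front matter in README.md).
# Use "peft" only if the repo contains a LoRA/adapter;
# for a full fine-tune, "transformers" is the right value.
library_name: transformers
base_model: meta-llama/Llama-3.3-70B-Instruct
tags:
  - autotrain
---
```

Which usage and deployment options the Hub shows for a repo can depend on this field, so it is worth checking that it matches what your training run actually produced.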