Note: The fine-tuned model is locally saved.
Notebook link
OSError: llama-2-7b-mlabonne-enhanced does not appear to have a file named config.json. Checkout ‘https://huggingface.co/llama-2-7b-mlabonne-enhanced/None’ for available files.
When fine-tuning a model, depending on the code, only the adapter weights may be saved rather than the full model. In that case you have to merge the adapter with the original base model before you can load it on its own.
I don't know how you trained it, but for LoRA (via PEFT) the merge method is called merge_and_unload().
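A minimal sketch of the merge step, assuming the model was trained with PEFT LoRA and the Transformers library. The model names and directory paths below are placeholders, not taken from your setup:

```python
def merge_lora(base_model_name: str, adapter_dir: str, output_dir: str) -> None:
    """Fold a saved LoRA adapter into its base model and save the full model.

    After this, output_dir contains config.json plus the merged weights,
    so AutoModelForCausalLM.from_pretrained(output_dir) works directly.
    """
    # Imports kept local so the sketch reads without the libraries installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    # Load the original base model the adapter was trained on.
    base = AutoModelForCausalLM.from_pretrained(base_model_name)

    # Attach the saved adapter weights on top of the base model.
    model = PeftModel.from_pretrained(base, adapter_dir)

    # Fold the LoRA deltas into the base weights and drop the adapter wrappers.
    merged = model.merge_and_unload()

    # save_pretrained() writes config.json alongside the merged weights,
    # which is exactly the file the OSError above complains is missing.
    merged.save_pretrained(output_dir)
    AutoTokenizer.from_pretrained(base_model_name).save_pretrained(output_dir)

# Example (placeholder paths -- adjust to your own checkpoint locations):
# merge_lora("meta-llama/Llama-2-7b-hf",
#            "llama-2-7b-mlabonne-enhanced",
#            "llama-2-7b-mlabonne-enhanced-merged")
```

After merging, point from_pretrained() at the merged output directory (a local path or a full repo id), not at a bare repo name, so the loader can find config.json.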