Fine-tuned model not deployed with endpoints, config.json file missing

Hi,
I have fine-tuned a llama-2-7b model on a text-generation task using Colab Pro (GPUs tested: T4 and A100).

The command I used to push the model to the hub is: model.push_to_hub("amr_my_llm_finetuned")
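For context, the push step looks roughly like this in my notebook (a minimal sketch; the local checkpoint path and the use of AutoModelForCausalLM/AutoTokenizer are assumptions, not my exact training script):

```python
# Sketch of the push step (checkpoint path and classes are illustrative).
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("./results/checkpoint-final")  # assumed local checkpoint
tokenizer = AutoTokenizer.from_pretrained("./results/checkpoint-final")

# For a full transformers model, push_to_hub uploads config.json alongside the weights;
# if only a PEFT/LoRA adapter is pushed, the repo gets adapter_config.json instead.
model.push_to_hub("amr_my_llm_finetuned")
tokenizer.push_to_hub("amr_my_llm_finetuned")
```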

Here is my model on the hub: royam0820/amr_my_llm_finetuned.
Here is the deployment endpoint: aws-amr-my-llm-finetuned-6755

During deployment I get this error: OSError: /repository does not appear to have a file named config.json. Indeed, this file is missing from my repo.

Please let me know how to fix this issue. Can I simply add a config.json file from abhishek/llama-2-7b-hf-small-shards?
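If copying the file over is indeed a valid fix, I imagine something like this would do it (a sketch using huggingface_hub; I'm not certain the config from abhishek/llama-2-7b-hf-small-shards actually matches my fine-tuned weights):

```python
# Sketch: copy config.json from the base-model repo into my fine-tuned repo.
# Assumes the base model's config is compatible with the fine-tuned weights.
from huggingface_hub import hf_hub_download, HfApi

config_path = hf_hub_download(
    repo_id="abhishek/llama-2-7b-hf-small-shards",
    filename="config.json",
)

HfApi().upload_file(
    path_or_fileobj=config_path,
    path_in_repo="config.json",
    repo_id="royam0820/amr_my_llm_finetuned",
)
```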

Thanks in advance for your help and support.

Anne-Marie


I also have the exact same issue. Did you figure out how to resolve this error?