Hi there,
I have a question about downloading an LLM.
Because my machine has no network access, I can't use the Hugging Face Hub to download the model directly.
My plan is to first download the whole repo on a connected machine.
I used `git clone https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct`
But the repo is too large. Which files must I download to make sure the model runs?
Thank you very much
You need all the files. The repo isn't that large for an 8B-parameter model. You could zip it into a single file and bring it over to your local system.
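If you do want to trim the download: as far as I know, `transformers` only reads the top-level `config.json`, the tokenizer files, and the `*.safetensors` shards plus their index; the `original/` folder in that repo holds Meta's raw `.pth` checkpoint, which `from_pretrained` never touches. Here is a small sketch of that filtering logic (the file list is abridged from the repo, so treat it as illustrative):

```python
from fnmatch import fnmatch

# Abridged file listing of the Meta-Llama-3-8B-Instruct repo; the
# original/ folder duplicates the weights in Meta's raw format.
repo_files = [
    "config.json",
    "generation_config.json",
    "model-00001-of-00004.safetensors",
    "model-00002-of-00004.safetensors",
    "model-00003-of-00004.safetensors",
    "model-00004-of-00004.safetensors",
    "model.safetensors.index.json",
    "tokenizer.json",
    "tokenizer_config.json",
    "special_tokens_map.json",
    "original/consolidated.00.pth",
]

# Patterns covering everything transformers loads at inference time.
needed = ["*.json", "*.safetensors"]

# Keep only top-level files matching the patterns; skip original/.
keep = [f for f in repo_files
        if "/" not in f and any(fnmatch(f, p) for p in needed)]
print(keep)
```

The same idea works with git: clone with `GIT_LFS_SKIP_SMUDGE=1 git clone …` so only the small pointer files come down, then run `git lfs pull --include="*.safetensors,*.json"` to fetch just the large files you actually need.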
You can use Artifactory as a proxy server for Hugging Face.
If you don't have network access but your Artifactory instance does, you can create a Hugging Face remote repository in Artifactory that points at huggingface.co and download the model through Artifactory acting as a proxy.
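In case it helps, a sketch of what the client side of that setup can look like: recent versions of `huggingface_hub` respect the `HF_ENDPOINT` environment variable, so you point it at the remote repository instead of huggingface.co. The server name and repository path below are placeholders for your own instance:

```shell
# Point huggingface_hub at the Artifactory remote repo instead of
# huggingface.co (hypothetical server name and repo path).
export HF_ENDPOINT="https://artifactory.example.com/artifactory/api/huggingfaceml/hf-remote"

# Downloads now go through Artifactory, which fetches and caches
# the files from huggingface.co on your behalf.
huggingface-cli download meta-llama/Meta-Llama-3-8B-Instruct
```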
I don't know what Artifactory is. Does that mean Artifactory itself needs a connection to the external network? Right?