Hello,
First day here.
After being approved for access to the repo and installing huggingface-cli, I tried:
(Copied this from the repo page meta-llama/Meta-Llama-3.1-8B-Instruct · Hugging Face)
huggingface-cli download meta-llama/Meta-Llama-3.1-8B-Instruct --include "original/*" --local-dir Meta-Llama-3.1-8B-Instruct
and got this error:
An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
My internet connection is fine. How do I proceed?
hi @abenari0
What does whoami give?
huggingface-cli whoami
If it says Not logged in, you can log in with huggingface-cli login
Also, your request may be blocked by a firewall or proxy. You can use curl to check:
curl https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct/resolve/8c22764a7e3675c50d4c7c9a4edb474456022b16/original/consolidated.00.pth
Without a token, it should return:
Access to model meta-llama/Meta-Llama-3.1-8B-Instruct is restricted. You must be authenticated to access it.
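Seeing that "restricted" message actually confirms the endpoint is reachable; only authentication is missing. To retry the request with credentials, the token from huggingface-cli login can be sent as a Bearer header. A minimal stdlib sketch (the helper name `auth_headers` is mine for illustration; `HF_TOKEN` is the environment variable the Hub tooling reads):

```python
import os

def auth_headers(token=None):
    """Build the Authorization header huggingface.co expects for gated repos.

    Resolution mirrors the CLI: an explicit token wins, otherwise fall back
    to the HF_TOKEN environment variable; no token means no header at all.
    """
    token = token or os.environ.get("HF_TOKEN")
    return {"Authorization": f"Bearer {token}"} if token else {}
```

With curl, the equivalent is adding `-H "Authorization: Bearer <your token>"` to the command above; a 200 (or a redirect to the file) instead of the restricted message means auth is working and the problem lies elsewhere.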
Hi @mahmutc, I am facing the same issue. Please help.
First, try huggingface-cli login and huggingface-cli whoami. If the situation remains unclear, suspect a network problem or a corrupted local cache.
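To rule out the cache, it can help to see what is already stored locally. A minimal stdlib sketch, assuming the default cache path ~/.cache/huggingface/hub (the HF_HOME or HF_HUB_CACHE environment variables can relocate it); the helper name `list_cached_repos` is mine:

```python
from pathlib import Path

# Default local Hub cache; HF_HOME / HF_HUB_CACHE environment variables
# can move it elsewhere, so check those first if this path is empty.
DEFAULT_CACHE = Path.home() / ".cache" / "huggingface" / "hub"

def list_cached_repos(cache_dir: Path = DEFAULT_CACHE) -> list[str]:
    """List cached repo folders (they are named like models--org--repo)."""
    if not cache_dir.is_dir():
        return []
    return sorted(
        p.name for p in cache_dir.iterdir()
        if p.is_dir() and p.name.split("--", 1)[0] in {"models", "datasets", "spaces"}
    )
```

If a half-downloaded copy of the model shows up here, one common fix is deleting just that one repo folder from the cache and re-running the download.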