Hello everyone,
I am encountering an error that has been posted several times on different forums, but none of the proposed solutions work in my case. So, I am reposting the error:
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like meta-llama/Meta-Llama-3-8B-Instruct is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'Installation'.
The code causing the error is as follows:
# Load the model and tokenizer from Hugging Face
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "meta-llama/Meta-Llama-3-8B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name, token=hf_token)
model = AutoModelForSequenceClassification.from_pretrained(model_name, token=hf_token)
My connection is working correctly; could some connection parameters (a proxy or firewall, for example) be causing the issue?
It's problematic because I absolutely need to run inference with this model.
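In case it is relevant, here is a minimal sketch I can run to check whether proxy-related environment variables are set on my machine; these can silently redirect or block the HTTPS requests that transformers makes to huggingface.co even when general browsing works. The variable names below are the standard ones honored by most Python HTTP clients; nothing here is specific to my setup.

```python
import os

def proxy_settings():
    """Return any proxy-related environment variables that are set.

    These variables are respected by the HTTP stack underneath
    huggingface_hub and can interfere with downloads from
    https://huggingface.co even when a browser connects fine.
    """
    names = ("HTTP_PROXY", "HTTPS_PROXY", "http_proxy",
             "https_proxy", "NO_PROXY", "no_proxy")
    return {name: os.environ[name] for name in names if name in os.environ}

# Print whatever is set, or a note that nothing is.
print(proxy_settings() or "no proxy variables set")
```

On my machine this prints nothing suspicious, which is part of why I am confused.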
I am available for any further details you might need.
Thank you very much in advance,