HuggingFace offline error

OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like custom-model is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

I have made sure to set:

HUGGINGFACE_HUB_CACHE to the directory /models

Inside /models there is a folder custom-model containing:

blobs config.json refs snapshots
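One thing worth checking: the hub cache normally names each repo folder models--{org}--{repo} (not a bare folder like custom-model), and config.json lives inside snapshots/<commit>/ rather than next to blobs and refs. Below is a minimal sketch of how that layout is resolved, built against a throwaway temp directory with hypothetical names (org, custom-model, commit hash abc123 are made up for illustration):

```python
import os
import tempfile

def resolve_snapshot(cache_dir, repo_folder):
    """Return the snapshot directory that refs/main points to."""
    repo = os.path.join(cache_dir, repo_folder)
    with open(os.path.join(repo, "refs", "main")) as f:
        commit = f.read().strip()
    return os.path.join(repo, "snapshots", commit)

# Build a minimal fake cache matching the expected hub layout:
cache = tempfile.mkdtemp()
repo = "models--org--custom-model"          # hypothetical repo folder name
snap = os.path.join(cache, repo, "snapshots", "abc123")
os.makedirs(snap)
os.makedirs(os.path.join(cache, repo, "blobs"))
os.makedirs(os.path.join(cache, repo, "refs"))
with open(os.path.join(cache, repo, "refs", "main"), "w") as f:
    f.write("abc123")                       # refs/main holds the commit hash
with open(os.path.join(snap, "config.json"), "w") as f:
    f.write("{}")                           # config.json sits inside the snapshot

path = resolve_snapshot(cache, repo)
print(os.path.isfile(os.path.join(path, "config.json")))  # True
```

If your /models/custom-model folder has config.json at its top level alongside blobs and snapshots, it matches neither the cache layout nor a plain local model directory, which would explain the "not the path to a directory containing a file named config.json" error.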

When using .from_pretrained(), try specifying the parameter trust_remote_code=True and not setting local_files_only=True (note the parameter is spelled local_files_only, not local_file_only). In addition, set HF_ENDPOINT to 'https://hf-mirror.com'. For example:

import os
import torch
from transformers import AutoModelForCausalLM

# Set the cache dir (must be done before loading anything)
os.environ['HF_HOME'] = '/data/shared_models/'
#   can also be done with the older (now deprecated) variable:
#   os.environ['TRANSFORMERS_CACHE'] = '/data/shared_models/'

# set mirror
os.environ['HF_ENDPOINT'] = 'https://hf-mirror.com'

# load model by model_id
model_id = "Qwen/Qwen2-72B-Instruct"
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,  # note: local_files_only is not set
)
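If the goal is instead to run fully offline from the cached files (the approach the error message's offline-mode link describes), the environment can be switched the other way. This is a sketch under the assumption that the model was already downloaded into the cache; the snapshot path shown in the comment is hypothetical:

```python
import os

# Force the hub libraries to read only from the local cache, no network:
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["HUGGINGFACE_HUB_CACHE"] = "/models"

# With the environment set before importing transformers, a fully local load
# would look like this (path is a placeholder for your snapshot directory):
# model = AutoModelForCausalLM.from_pretrained(
#     "/models/models--org--custom-model/snapshots/<hash>",
#     local_files_only=True,
# )
```

Setting these variables before the transformers import matters, since some of them are read at import time.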