Download models for local loading

How can I download models from the Hugging Face Hub directly into a specified directory on my local machine, rather than having them downloaded automatically into the cache location?

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id,
                                             device_map="auto")
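For reference, a minimal sketch of two common ways to control where the files land on disk: `snapshot_download` from `huggingface_hub` can fetch a repo's files straight into a chosen directory, and `save_pretrained` can persist an already-loaded model and tokenizer to a local path for later offline loading. The target directory `./models/llama-2-7b-chat` is a hypothetical example, and note that Llama-2 is a gated repo, so a valid Hugging Face token is required before either call will succeed.

```python
# Sketch, not a definitive recipe: two ways to put Hugging Face model files
# in a directory of your choosing instead of the default cache.
# Assumes transformers and huggingface_hub are installed, and that you have
# access to the gated meta-llama repo (huggingface-cli login beforehand).
from huggingface_hub import snapshot_download
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"
local_dir = "./models/llama-2-7b-chat"  # hypothetical target directory


def download_to_dir(repo_id: str, target: str) -> str:
    # Option 1: download the raw repo files into `target` instead of the
    # default ~/.cache/huggingface location; returns the local path.
    return snapshot_download(repo_id=repo_id, local_dir=target)


def load_and_save(repo_id: str, target: str) -> None:
    # Option 2: load normally (files go through the cache once), then
    # persist weights and tokenizer so later loads can point at `target`.
    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")
    tokenizer.save_pretrained(target)
    model.save_pretrained(target)


# Usage (downloads several GB, so not run here):
#   path = download_to_dir(model_id, local_dir)
#   tokenizer = AutoTokenizer.from_pretrained(local_dir)
#   model = AutoModelForCausalLM.from_pretrained(local_dir, device_map="auto")
```

A third option is passing `cache_dir="/some/path"` to `from_pretrained` itself, which relocates the cache rather than producing a plain directory of model files; `snapshot_download` with `local_dir` is the more direct fit for the question as asked.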