Download models for local loading

The model_id parameter of from_pretrained can also be a local folder path, so if you know where the model was downloaded to, you can pass that path instead of the Hub model id.

So if your model was downloaded to c:/llama-chat, you would change the lines to:

from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("c:/llama-chat", device_map="auto")

I hope that helps.