How to set model_path to another directory?

Hello! I am trying to follow these instructions to use both GPU and CPU offloading.

Below is the code I am using

Set the quantization config with llm_int8_enable_fp32_cpu_offload set to True

quantization_config = BitsAndBytesConfig(llm_int8_enable_fp32_cpu_offload=True)

device_map = {
    "transformer.word_embeddings": 0,
    "transformer.word_embeddings_layernorm": 0,
    "lm_head": "cpu",
    "transformer.h": 0,
    "transformer.ln_f": 0,
}

model_path = "decapoda-research/llama-7b-hf"
model_8bit = AutoModelForCausalLM.from_pretrained(
    model_path,
    device_map=device_map,
    quantization_config=quantization_config,
)
My problem is that model_path has to point to a specific model on the Hugging Face Hub. I already have these models downloaded, but in another path:

When I run the code above, it downloads the model again. How do I get model_path to point at my already downloaded models?

Even when I let the download run, I get this error:

Traceback (most recent call last):
File "C:\Windows\System32\text-generation-webui\", line 33, in
model_8bit = AutoModelForCausalLM.from_pretrained(
File "C:\Users\justi\miniconda3\envs\textgen\lib\site-packages\transformers\models\auto\", line 471, in from_pretrained
return model_class.from_pretrained(
File "C:\Users\justi\miniconda3\envs\textgen\lib\site-packages\transformers\", line 2643, in from_pretrained
) = cls._load_pretrained_model(
File "C:\Users\justi\miniconda3\envs\textgen\lib\site-packages\transformers\", line 2966, in _load_pretrained_model
new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
File "C:\Users\justi\miniconda3\envs\textgen\lib\site-packages\transformers\", line 662, in _load_state_dict_into_meta_model
raise ValueError(f"{param_name} doesn't have any device set.")
ValueError: model.layers.0.self_attn.q_proj.weight doesn't have any device set.
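Poking at the error, I suspect the device_map keys are the problem: they use transformer.* prefixes (the BLOOM layout from the docs example), while the failing parameter is named model.layers.0..., so no entry matches and the parameter never gets a device. A minimal sketch of the prefix matching as I understand it; find_device and both example maps are my own illustration, not the accelerate API:

```python
def find_device(param_name: str, device_map: dict):
    """Longest-prefix match of a parameter name against a device_map.
    Returns None when no entry covers the parameter, which is roughly
    the situation the ValueError above complains about."""
    best = None
    for prefix, device in device_map.items():
        if param_name == prefix or param_name.startswith(prefix + "."):
            if best is None or len(prefix) > len(best[0]):
                best = (prefix, device)
    return best[1] if best else None


# The map from the docs example uses BLOOM-style module names:
bloom_style_map = {
    "transformer.word_embeddings": 0,
    "transformer.word_embeddings_layernorm": 0,
    "lm_head": "cpu",
    "transformer.h": 0,
    "transformer.ln_f": 0,
}
# LLaMA parameters live under "model.*", so nothing matches:
print(find_device("model.layers.0.self_attn.q_proj.weight", bloom_style_map))  # None

# A map keyed on the model's actual module names would match:
llama_style_map = {"model.embed_tokens": 0, "model.layers": 0, "model.norm": 0, "lm_head": "cpu"}
print(find_device("model.layers.0.self_attn.q_proj.weight", llama_style_map))  # 0
```

If that is the cause, either renaming the device_map keys to the model's real module names or simply passing device_map="auto" should avoid the error, though I have not confirmed this.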