ValueError: model.embed_tokens.weight doesn't have any device set

Hi, I am getting this error and I don't understand what it means. Can someone explain?
This is the code that triggers the error:
import torch
from transformers import BitsAndBytesConfig, LlamaForCausalLM

device_map = {
    "transformer.word_embeddings": 0,
    "transformer.word_embeddings_layernorm": 0,
    "lm_head": "cpu",
    "transformer.h": 0,
    "transformer.ln_f": 0,
}

quantization_config = BitsAndBytesConfig(llm_int8_enable_fp32_cpu_offload=True)

model = LlamaForCausalLM.from_pretrained(
    base_model,
    # load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map=device_map,
    quantization_config=quantization_config,
)
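I wonder if the problem is that my device_map keys don't match the module names in LlamaForCausalLM; the transformer.* keys look like they belong to a different architecture (BLOOM, I think). Based on the model.embed_tokens name in the error message, I'd guess a Llama-style map would look something like the sketch below, but I haven't tested it:

# Untested guess: use Llama's own module names
# (model.embed_tokens, model.layers, model.norm) instead of
# the transformer.* names, which I believe come from BLOOM.
device_map = {
    "model.embed_tokens": 0,
    "model.layers": 0,
    "model.norm": 0,
    "lm_head": "cpu",
}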
