Rope Factor issues with meta-llama/Meta-Llama-3.1-70B

With this code in Colab:

import torch
from transformers import AutoModelForCausalLM

model_name = "meta-llama/Meta-Llama-3.1-70B"

# Load model
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto",
    use_auth_token=True,
    rope_scaling={"type": "dynamic", "factor": 8.0}  # Ensure correct parameter format
)

I am getting this error message:

ValueError: `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'low_freq_factor': 1.0, 'high_freq_factor': 4.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}

Hi @kgbearder,
Have you seen meta-llama/Meta-Llama-3.1-8B-Instruct · ValueError: `rope_scaling` must be a dictionary with two fields?

A simple Google search gives the answer: ValueError: `rope_scaling` must be a dictionary with two fields, `type` and `factor` · Issue #299 · meta-llama/llama3 · GitHub
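In short, the Llama 3.1 checkpoints ship the new llama3 rope_scaling format in their config.json (the exact dictionary shown in your error), while older transformers releases only accept the two-field {type, factor} form, so they reject it. As a quick check before retrying, you can print the installed version; support for the llama3 format landed in recent releases (roughly 4.43 and newer, stated here as an assumption):

import transformers

# Llama 3.1's llama3 rope_scaling is only understood by recent transformers releases
print(transformers.__version__)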

@nielsr @mahmutc thanks for the guidance. It was resolved by:

pip install --upgrade transformers

and then using:

rope_scaling={"type": "llama3", "factor": 8.0}
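For completeness, a minimal sketch of the full loading call after the upgrade. It assumes model_name still points to meta-llama/Meta-Llama-3.1-70B and that you have access to the gated repo; note that with an up-to-date transformers you can also drop the rope_scaling override entirely, since the library reads the llama3 settings straight from the model's config.json (that variant, not the explicit override above, is what the sketch shows):

import torch
from transformers import AutoModelForCausalLM

model_name = "meta-llama/Meta-Llama-3.1-70B"  # same model as in the question

# With a recent transformers (roughly 4.43+), the llama3 rope_scaling in the
# checkpoint's config.json is parsed correctly, so no override is needed.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto",   # requires accelerate to be installed
    token=True,          # replaces the deprecated use_auth_token argument
)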