Fine-tuned model of CodeLlama

Hi, I am trying to load a fine-tuned CodeLlama model. When I run this:

```python
from transformers import AutoConfig, LlamaForCausalLM

model_name = "natankatz/codellama2"
config = AutoConfig.from_pretrained(model_name, output_hidden_states=True)
model = LlamaForCausalLM.from_pretrained(
    pretrained_model_name_or_path=model_name,
    config=config,
)
```

I receive:

```
Process finished with exit code 137 (interrupted by signal 9: SIGKILL)
```
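For context on that exit code (my reading of it, in case it helps): shells report a process killed by a fatal signal as `128 + signal number`, so 137 corresponds to signal 9 (SIGKILL), which is what the kernel's OOM killer sends when a process exhausts available memory.

```python
import signal

# Exit codes above 128 encode the fatal signal as 128 + signal number.
# Signal 9 is SIGKILL, so an OOM-killed process exits with code 137.
exit_code = 128 + signal.SIGKILL
print(exit_code)  # 137
```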

However, when I load the base CodeLlama model, or load this model through `pipeline`, it works fine.
What am I doing wrong?
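In case memory is the issue, here is a lower-memory variant of the load I could try. This is a sketch, not something I have confirmed fixes it: it assumes the checkpoint was saved in (or can be cast to) half precision, and uses the `torch_dtype` and `low_cpu_mem_usage` options of `from_pretrained` to avoid materializing full fp32 weights in RAM. Note that running it downloads the full checkpoint, so it needs network access and several GB of memory.

```python
import torch
from transformers import AutoConfig, LlamaForCausalLM

model_name = "natankatz/codellama2"
config = AutoConfig.from_pretrained(model_name, output_hidden_states=True)

# torch_dtype=torch.float16 roughly halves the RAM needed for the weights;
# low_cpu_mem_usage=True streams weights into the model instead of first
# allocating a full randomly-initialized copy.
model = LlamaForCausalLM.from_pretrained(
    model_name,
    config=config,
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,
)
```

If the pipeline path works while this direct load does not, it may simply be that `pipeline` picks a more memory-friendly dtype or loading strategy by default.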