Hi, I am trying to load a fine-tuned CodeLlama model. When I run:
from transformers import AutoConfig, LlamaForCausalLM

config = AutoConfig.from_pretrained(model_name, output_hidden_states=True)
model = LlamaForCausalLM.from_pretrained(model_name, config=config)
the process is killed:

Process finished with exit code 137 (interrupted by signal 9: SIGKILL)
However, when I load the base CodeLlama model, or run the model through a pipeline, it works fine.
What am I doing wrong?
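Edit: from what I've read, exit code 137 means the process was killed by SIGKILL, which is usually the kernel's OOM killer, so I suspect the default full-precision load simply doesn't fit in memory. Below is a sketch of what I'm considering trying instead: it assumes a roughly 7B-parameter checkpoint and that `accelerate` is installed, and "my-finetuned-codellama" is a placeholder for my actual model path.

```python
# Sketch: a lower-memory way to load the checkpoint, plus the rough memory
# math that suggests why the default fp32 load gets OOM-killed.
# Assumptions: ~7B parameters, `accelerate` installed, placeholder model name.

def load_low_memory(model_name: str = "my-finetuned-codellama"):
    """Load in half precision, streaming weights (requires `accelerate`)."""
    import torch
    from transformers import AutoConfig, LlamaForCausalLM

    config = AutoConfig.from_pretrained(model_name, output_hidden_states=True)
    return LlamaForCausalLM.from_pretrained(
        model_name,
        config=config,
        torch_dtype=torch.float16,   # half the bytes of the fp32 default
        low_cpu_mem_usage=True,      # avoid materializing a second full copy in RAM
        device_map="auto",           # shard layers across GPU(s) and CPU as needed
    )

# Back-of-envelope memory footprint for a 7B model:
n_params = 7e9
fp32_gib = n_params * 4 / 2**30  # from_pretrained loads fp32 by default
fp16_gib = n_params * 2 / 2**30  # with torch_dtype=torch.float16
print(f"fp32: {fp32_gib:.1f} GiB, fp16: {fp16_gib:.1f} GiB")
```

Does that look like the right direction, or is the OOM caused by something else (e.g. `output_hidden_states=True`)?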