KeyError when trying to download GPT-J-6B checkpoint

```python
from transformers import AutoModelForCausalLM
import torch

model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B", torch_dtype=torch.float32)
```

results in a

KeyError: 'gptj'

when attempting to download the checkpoint. I am running transformers version 4.10.2.
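For context, a hypothetical sketch of where this error comes from: the `AutoModelForCausalLM` class resolves the checkpoint's `model_type` (from its `config.json`) through an internal mapping, and in transformers 4.10.2 there is no `"gptj"` entry in that mapping. The dictionary below is illustrative, not the library's actual mapping.

```python
# Illustrative mapping only -- the real one lives inside transformers.
# In 4.10.2 it has entries for gpt2, gpt_neo, etc., but not "gptj".
MODEL_MAPPING = {"gpt2": "GPT2LMHeadModel", "gpt_neo": "GPTNeoForCausalLM"}

model_type = "gptj"  # read from the downloaded config.json
try:
    cls = MODEL_MAPPING[model_type]
except KeyError as err:
    print(f"KeyError: {err}")  # prints: KeyError: 'gptj'
```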

similar to topic:

The GitHub pull request seems to have been merged already.

GPT-J has been merged and is part of version 4.11, so you should be able to update your transformers version to the latest one and use GPT-J.

I see, thanks Bram! That did the trick.