How to get "EleutherAI/gpt-j-6B" working?

I’m trying to run the EleutherAI/gpt-j-6B model, but with no luck. The code

model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

returns the following error:

Traceback (most recent call last):
  File "", line 18, in <module>
    model = AutoModelForCausalLM.from_pretrained("gpt-j-6B")
  File "/home/marcin/miniconda3/envs/py37/lib/python3.7/site-packages/transformers/models/auto/", line 383, in from_pretrained
    pretrained_model_name_or_path, return_unused_kwargs=True, **kwargs
  File "/home/marcin/miniconda3/envs/py37/lib/python3.7/site-packages/transformers/models/auto/", line 514, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/home/marcin/miniconda3/envs/py37/lib/python3.7/site-packages/transformers/models/auto/", line 263, in __getitem__
    raise KeyError(key)
KeyError: 'gptj'

I’ve tried transformers version 4.9.2 as well as the latest 4.10.0.dev0 from the GitHub main branch. Apparently there is no model_type named gptj. Do I need to add it somehow?
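For context on where that KeyError comes from: AutoConfig reads the model_type field from the downloaded config.json and looks it up in a registry of supported architectures; a release that predates GPT-J simply has no "gptj" entry. A minimal sketch of that lookup, with a plain dict and a hypothetical resolve_config helper standing in for the library's actual registry code:

```python
# Illustrative miniature of the auto-config registry lookup.
# The dict contents and helper name are hypothetical; only the
# failure mode (KeyError: 'gptj' on older releases) mirrors the
# traceback above.
CONFIG_MAPPING = {
    "gpt2": "GPT2Config",
    "gpt_neo": "GPTNeoConfig",
    # no "gptj" entry before GPT-J support is merged
}

def resolve_config(model_type: str) -> str:
    """Map a config.json model_type to a config class name."""
    try:
        return CONFIG_MAPPING[model_type]
    except KeyError:
        # The library's registry re-raises with the missing key,
        # which is exactly the "KeyError: 'gptj'" seen above.
        raise KeyError(model_type)

print(resolve_config("gpt_neo"))  # a supported type resolves fine
# resolve_config("gptj")          # raises KeyError: 'gptj'
```

So the error is not about the model weights at all; it only means the installed transformers release does not yet know the gptj architecture.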

Hi, GPT-J-6B is not added yet to the library. It will be soon though: GPT-J-6B by StellaAthena · Pull Request #13022 · huggingface/transformers · GitHub


Thanks for your answer! Thanks to you, I found the right fork and got it working in the meantime.

Maybe it would be beneficial to include information about which version of the library each model runs with? (Possibly as an extension of the Hugging Face web interface.)

Hi, thank you for linking the right fork! I am new to Hugging Face and I can’t figure out how to get it working as you did. Could you please point me in the right direction?

This worked for me:

  1. uninstall previous version:
    pip uninstall transformers
  2. install the fork:
    pip install git+
  3. use the model:
    model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

Just remember: this model needs roughly 24 GB of memory simply to load the weights in float32.
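The 24 GB figure follows from back-of-the-envelope arithmetic (the ~6B parameter count is approximate): each float32 parameter takes 4 bytes, so the weights alone need about 24 GB, before any activations or optimizer state.

```python
# Rough weight-memory estimate for GPT-J-6B (weights only;
# activations, KV cache, and optimizer state add more on top).
n_params = 6e9                 # ~six billion parameters (approximate)
bytes_fp32 = n_params * 4      # float32: 4 bytes per parameter
bytes_fp16 = n_params * 2      # float16 halves the footprint

print(f"fp32 weights: {bytes_fp32 / 2**30:.1f} GiB")  # ≈ 22.4 GiB (~24 GB)
print(f"fp16 weights: {bytes_fp16 / 2**30:.1f} GiB")  # ≈ 11.2 GiB (~12 GB)
```

If memory is tight, loading the weights in float16 cuts the requirement roughly in half, at some cost in numerical precision.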
