I’m trying to run the EleutherAI/gpt-j-6B model, but with no luck. The code

from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")
returns the following error:
Traceback (most recent call last):
  File "gptjtest.py", line 18, in <module>
    model = AutoModelForCausalLM.from_pretrained("gpt-j-6B")
  File "/home/marcin/miniconda3/envs/py37/lib/python3.7/site-packages/transformers/models/auto/auto_factory.py", line 383, in from_pretrained
    pretrained_model_name_or_path, return_unused_kwargs=True, **kwargs
  File "/home/marcin/miniconda3/envs/py37/lib/python3.7/site-packages/transformers/models/auto/configuration_auto.py", line 514, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/home/marcin/miniconda3/envs/py37/lib/python3.7/site-packages/transformers/models/auto/configuration_auto.py", line 263, in __getitem__
    raise KeyError(key)
KeyError: 'gptj'
I’ve tried transformers version 4.9.2 as well as the latest 4.10.0.dev0 from GitHub trunk. Apparently neither registers a model_type of gptj. Do I need to add it somehow?
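For context, here is how I’m double-checking which transformers installation my script actually imports (in case multiple conda environments are involved). This is just a diagnostic sketch; the helper name transformers_status is mine, not from the library.

```python
import importlib.util

def transformers_status():
    """Return the installed transformers version string, or None if the
    package is not importable from this interpreter."""
    if importlib.util.find_spec("transformers") is None:
        return None
    import transformers
    return transformers.__version__

print(transformers_status())
```

In my environment this prints 4.9.2 (or 4.10.0.dev0 after installing from trunk), so the error doesn’t seem to come from a stale environment being picked up.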