Unrecognized configuration class <....LlamaConfig> for AutoModelForSeq2SeqLM

Hi, the subject line is a summary of the actual error I get when trying to use 'TheBloke/OpenAssistant-Llama2-13B-Orca-8K-3319-GPTQ'.

Unrecognized configuration class <class 'transformers.models.llama.configuration_llama.LlamaConfig'> for this kind of AutoModel: AutoModelForSeq2SeqLM

My code:

# Install necessary library

!pip install transformers

# Import necessary modules

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Specify the model name

model_name = 'TheBloke/OpenAssistant-Llama2-13B-Orca-8K-3319-GPTQ'  # replace this with your actual model name

# Load the model and the tokenizer

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

The error:

ValueError                                Traceback (most recent call last)
<ipython-input-4-a4bfd7aa96fc> in <cell line: 3>()
      1 # Load the model and the tokenizer
      2 tokenizer = AutoTokenizer.from_pretrained(model_name)
----> 3 model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    494                 pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
    495             )
--> 496         raise ValueError(
    497             f"Unrecognized configuration class {config.__class__} for this kind of AutoModel: {cls.__name__}.\n"
    498             f"Model type should be one of {', '.join(c.__name__ for c in cls._model_mapping.keys())}."

ValueError: Unrecognized configuration class <class 'transformers.models.llama.configuration_llama.LlamaConfig'> for this kind of AutoModel: AutoModelForSeq2SeqLM.
Model type should be one of BartConfig, BigBirdPegasusConfig, BlenderbotConfig, BlenderbotSmallConfig, EncoderDecoderConfig, FSMTConfig, GPTSanJapaneseConfig, LEDConfig, LongT5Config, M2M100Config, MarianConfig, MBartConfig, MT5Config, MvpConfig, NllbMoeConfig, PegasusConfig, PegasusXConfig, PLBartConfig, ProphetNetConfig, SwitchTransformersConfig, T5Config, UMT5Config, XLMProphetNetConfig.

Encountered the same issue. It seems AutoModelForSeq2SeqLM and LlamaConfig are not compatible: Llama is a decoder-only (causal) model, so it should be loaded with AutoModelForCausalLM instead of AutoModelForSeq2SeqLM.
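A quick way to confirm this is to inspect which config classes each auto class accepts. This is a sketch using `_model_mapping`, a private transformers attribute (it appears in the traceback above, but it may change between versions):

```python
from transformers import AutoModelForCausalLM, AutoModelForSeq2SeqLM
from transformers.models.llama.configuration_llama import LlamaConfig

# LlamaConfig is registered for causal-LM loading, not for seq2seq,
# which is exactly why from_pretrained raises the ValueError above.
print(LlamaConfig in AutoModelForCausalLM._model_mapping.keys())   # True
print(LlamaConfig in AutoModelForSeq2SeqLM._model_mapping.keys())  # False
```

So the fix is to swap the auto class in the original snippet: `model = AutoModelForCausalLM.from_pretrained(model_name)`.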

The problem is that TensorFlow doesn't support AutoModelForSeq2SeqLM, and you have to install PyTorch instead of TensorFlow. Try uninstalling TensorFlow and installing PyTorch.
