Error while accessing pretrained model with auth token

AutoModelForSeq2SeqLM.from_pretrained("meta-llama/Llama-2-7b-hf", token=auth_token)
The line above fails with the error below. Please help me resolve this issue.


ValueError Traceback (most recent call last)
Cell In[11], line 23
12 # # Create tokenizer
13 # tokenizer = AutoTokenizer.from_pretrained(name,
14 # cache_dir='./model/', token=auth_token)
(…)
20
21 # Load the tokenizer and model
22 tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf", token=auth_token)
---> 23 model = AutoModelForSeq2SeqLM.from_pretrained("meta-llama/Llama-2-7b-hf", token=auth_token)

File ~/demo_notebook/notebookenv/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py:564, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
560 model_class = _get_model_class(config, cls._model_mapping)
561 return model_class.from_pretrained(
562 pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
563 )
--> 564 raise ValueError(
565     f"Unrecognized configuration class {config.__class__} for this kind of AutoModel: {cls.__name__}.\n"
566     f"Model type should be one of {', '.join(c.__name__ for c in cls._model_mapping.keys())}."
567 )

ValueError: Unrecognized configuration class <class 'transformers.models.llama.configuration_llama.LlamaConfig'> for this kind of AutoModel: AutoModelForSeq2SeqLM.
Model type should be one of BartConfig, BigBirdPegasusConfig, BlenderbotConfig, BlenderbotSmallConfig, EncoderDecoderConfig, FSMTConfig, GPTSanJapaneseConfig, LEDConfig, LongT5Config, M2M100Config, MarianConfig, MBartConfig, MT5Config, MvpConfig, NllbMoeConfig, PegasusConfig, PegasusXConfig, PLBartConfig, ProphetNetConfig, SeamlessM4TConfig, SeamlessM4Tv2Config, SwitchTransformersConfig, T5Config, UMT5Config, XLMProphetNetConfig.
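
Based on the error message, I suspect the problem is the auto class itself: Llama-2 is a decoder-only (causal) model, and AutoModelForSeq2SeqLM only accepts encoder-decoder configs such as T5Config or BartConfig. Here is a minimal sketch of what I think the fix looks like, reusing the model name and auth_token from above (this is my assumption, not yet verified):

from transformers import AutoModelForCausalLM, AutoTokenizer

name = "meta-llama/Llama-2-7b-hf"
# Llama-2 uses LlamaConfig, a decoder-only configuration, so load it
# with the causal-LM auto class rather than the seq2seq one.
tokenizer = AutoTokenizer.from_pretrained(name, token=auth_token)
model = AutoModelForCausalLM.from_pretrained(name, token=auth_token)

Is switching to AutoModelForCausalLM the right approach here?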