OSError: HuggingFaceM4/idefics2-8b does not appear to have a file named config.json

I'm trying to use Idefics2 with the example code from its model card. I can load the HuggingFaceM4/idefics2-8b-base model without a problem, but I get the following error when using HuggingFaceM4/idefics2-8b. What could be the issue?
I tried transformers versions 4.40.0, 4.40.2, and 4.41.1 and hit the same error with each.
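
For reference, this is roughly what I'm running (a minimal sketch adapted from the model card example; my actual script isn't shown here):

from transformers import AutoProcessor, AutoModelForVision2Seq

# Works: loading the base checkpoint
processor_base = AutoProcessor.from_pretrained("HuggingFaceM4/idefics2-8b-base")

# Fails with the OSError below: loading the instruct checkpoint
processor = AutoProcessor.from_pretrained("HuggingFaceM4/idefics2-8b")
model = AutoModelForVision2Seq.from_pretrained("HuggingFaceM4/idefics2-8b")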

The full error log is below:

Traceback (most recent call last):
  File "/export/home/xxx.py", line 19, in <module>
    processor = AutoProcessor.from_pretrained("HuggingFaceM4/idefics2-8b")
  File "/root/miniconda3/envs/eval/lib/python3.10/site-packages/transformers/models/auto/processing_auto.py", line 287, in from_pretrained
    config = AutoConfig.from_pretrained(
  File "/root/miniconda3/envs/eval/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 934, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/envs/eval/lib/python3.10/site-packages/transformers/configuration_utils.py", line 632, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/envs/eval/lib/python3.10/site-packages/transformers/configuration_utils.py", line 689, in _get_config_dict
    resolved_config_file = cached_file(
  File "/root/miniconda3/envs/eval/lib/python3.10/site-packages/transformers/utils/hub.py", line 370, in cached_file
    raise EnvironmentError(
OSError: HuggingFaceM4/idefics2-8b does not appear to have a file named config.json. Checkout 'https://huggingface.co/HuggingFaceM4/idefics2-8b/tree/None' for available files.

@VictorSanh @Leyo Any suggestions? Thanks!