Error: pretrained_model_name_or_path cannot be None

System Info

  • transformers version: 4.33.2
  • Platform: Linux-5.4.0-147-generic-x86_64-with-glibc2.31
  • Python version: 3.11.4
  • Huggingface_hub version: 0.16.4
  • Safetensors version: 0.3.1
  • Accelerate version: 0.23.0
  • Accelerate config: not found
  • PyTorch version (GPU?): 2.0.1+cu118 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: yes
  • Using distributed or parallel set-up in script?: yes

I have a pre-trained model that I want to load from its state dict. I use the following code:

import transformers

# model_type, num_classes and state_dict are defined earlier (loaded from my checkpoint)
model_class = getattr(transformers, 'XLMRobertaForSequenceClassification')
model = model_class.from_pretrained(
    pretrained_model_name_or_path=None,
    config=model_type,
    num_labels=num_classes,
    state_dict=state_dict,
)
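
For context, state_dict, model_type, and num_classes come from a saved checkpoint, roughly like this (a sketch; the file path and key names are placeholders for my actual checkpoint format):

import torch

checkpoint = torch.load("path/to/checkpoint.pt", map_location="cpu")  # placeholder path
state_dict = checkpoint["state_dict"]                                 # fine-tuned weights
model_type = checkpoint["model_type"]                                 # base model identifier
num_classes = len(checkpoint["class_names"])                          # size of the label set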

The from_pretrained call above worked on transformers version 4.22.1. However, after upgrading to version 4.33.2, I get the following error:

Traceback (most recent call last):
  File "/home/raha/.conda/envs/myenv/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 261, in hf_raise_for_status
    response.raise_for_status()
  File "/home/raha/.conda/envs/myenv/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/None/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/raha/.conda/envs/myenv/lib/python3.11/site-packages/transformers/utils/hub.py", line 429, in cached_file
    resolved_file = hf_hub_download(
                    ^^^^^^^^^^^^^^^^
  File "/home/raha/.conda/envs/myenv/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/home/raha/.conda/envs/myenv/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1195, in hf_hub_download
    metadata = get_hf_file_metadata(
               ^^^^^^^^^^^^^^^^^^^^^
  File "/home/raha/.conda/envs/myenv/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/home/raha/.conda/envs/myenv/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1541, in get_hf_file_metadata
    hf_raise_for_status(r)
  File "/home/raha/.conda/envs/myenv/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 293, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-65146a5e-2beb5c500637678b292e6559;39635aa7-8f65-42b4-8320-0877d5013995)

Repository Not Found for url: https://huggingface.co/None/resolve/main/config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/code/my_model.py", line 239, in <module>
    model, labels = load_model(args.model)
                    ^^^^^^^^^^^^^^^^^^^^^^
  File "/code/my_model.py", line 105, in load_model
    model = MyModel('my_model_name', device='cuda')
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/code/model_loader.py", line 111, in __init__
    self.model, self.tokenizer, self.class_names = load_checkpoint(
                                                   ^^^^^^^^^^^^^^^^
  File "/code/model_loader.py", line 100, in load_checkpoint
    model, tokenizer = get_model_and_tokenizer(
                       ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/code/model_loader.py", line 66, in get_model_and_tokenizer
    model = model_class.from_pretrained(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/raha/.conda/envs/myenv/lib/python3.11/site-packages/transformers/modeling_utils.py", line 2389, in from_pretrained
    resolved_config_file = cached_file(
                           ^^^^^^^^^^^^
  File "/home/raha/.conda/envs/myenv/lib/python3.11/site-packages/transformers/utils/hub.py", line 450, in cached_file
    raise EnvironmentError(
OSError: None is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`
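
For now, a possible workaround seems to be building the config explicitly and loading the weights with load_state_dict instead of passing pretrained_model_name_or_path=None. A minimal sketch (xlm-roberta-base is a placeholder for whatever base checkpoint the config should come from; state_dict and num_classes are the same objects as above):

from transformers import XLMRobertaConfig, XLMRobertaForSequenceClassification

# Build the architecture from a config object only; no pretrained weights are fetched here.
config = XLMRobertaConfig.from_pretrained("xlm-roberta-base", num_labels=num_classes)
model = XLMRobertaForSequenceClassification(config)

# Load the fine-tuned weights manually and report any mismatched keys.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print(f"missing keys: {missing}, unexpected keys: {unexpected}")

Still, I would like to know whether passing pretrained_model_name_or_path=None together with config and state_dict is meant to keep working in 4.33.2, since it did in 4.22.1.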