Transformers CLI tool: error: invalid choice: 'repo'

Hi, this is my first time sharing pre-trained weights.

When I followed the steps at https://huggingface.co/transformers/model_sharing.html, I got the error below.

$ transformers-cli repo create <my model name>
2020-11-25 06:52:26.478076: W tensorflow/stream_executor/platform/default/dso_loader.cc:59] Could not load dynamic library 'libcudart.so.10.1'; dlerror: libcudart.so.10.1: cannot open shared object file: No such file or directory
2020-11-25 06:52:26.478102: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
PyTorch version 1.6.0 available.
TensorFlow version 2.3.0 available.
usage: transformers-cli <command> [<args>]
Transformers CLI tool: error: invalid choice: 'repo' (choose from 'convert', 'download', 'env', 'run', 'serve', 'login', 'whoami', 'logout', 's3', 'upload')
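
For reference, the installed version can be checked like this (the repo subcommand is simply missing from the list of choices above, which suggests an older release):

$ python -c "import transformers; print(transformers.__version__)"
$ pip show transformers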

Is repo deprecated?

Thanks in advance.

I created a repo on the website and uploaded the model, but it seems I cannot load the model I just uploaded.

OSError                                   Traceback (most recent call last)
~/anaconda3/envs/fastaiv2/lib/python3.7/site-packages/transformers/configuration_utils.py in get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
    355             if resolved_config_file is None:
--> 356                 raise EnvironmentError
    357             config_dict = cls._dict_from_json_file(resolved_config_file)

OSError: 

During handling of the above exception, another exception occurred:

OSError                                   Traceback (most recent call last)
<ipython-input-13-1f125ff93000> in <module>
      1 from transformers import AutoTokenizer, AutoModel
      2 
----> 3 tokenizer = AutoTokenizer.from_pretrained("kouohhashi/roberta_ja")
      4 
      5 model = AutoModel.from_pretrained("kouohhashi/roberta_ja")

~/anaconda3/envs/fastaiv2/lib/python3.7/site-packages/transformers/tokenization_auto.py in from_pretrained(cls, pretrained_model_name_or_path, *inputs, **kwargs)
    204         config = kwargs.pop("config", None)
    205         if not isinstance(config, PretrainedConfig):
--> 206             config = AutoConfig.from_pretrained(pretrained_model_name_or_path, **kwargs)
    207 
    208         if "bert-base-japanese" in str(pretrained_model_name_or_path):

~/anaconda3/envs/fastaiv2/lib/python3.7/site-packages/transformers/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
    297             {'foo': False}
    298         """
--> 299         config_dict, _ = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
    300 
    301         if "model_type" in config_dict:

~/anaconda3/envs/fastaiv2/lib/python3.7/site-packages/transformers/configuration_utils.py in get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
    363                 f"- or '{pretrained_model_name_or_path}' is the correct path to a directory containing a {CONFIG_NAME} file\n\n"
    364             )
--> 365             raise EnvironmentError(msg)
    366 
    367         except json.JSONDecodeError:

OSError: Can't load config for 'kouohhashi/roberta_ja'. Make sure that:

- 'kouohhashi/roberta_ja' is a correct model identifier listed on 'https://huggingface.co/models'

- or 'kouohhashi/roberta_ja' is the correct path to a directory containing a config.json file

I can find my model on https://huggingface.co/models
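
As far as I can tell from that error, from_pretrained needs at least a config.json at the repo root. A minimal sketch of the expected layout, with file names assumed from the standard RoBERTa save format:

config.json          # read by AutoConfig / AutoModel.from_pretrained
pytorch_model.bin    # PyTorch model weights
vocab.json           # RoBERTa tokenizer vocabulary
merges.txt           # RoBERTa BPE merge rules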

Thanks in advance.

Hi @kouohhashi, you need to upgrade to transformers > v3.5.x to get this command.
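
For example, with a pip-based install (a sketch, reusing the same <my model name> placeholder as above):

$ pip install --upgrade transformers
$ transformers-cli login
$ transformers-cli repo create <my model name>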


Thanks. It worked!