I want to upload my model but I'm not sure what I'm doing wrong

I trained a model using Google Colab and it's now finished. It's a translator and I would like to make it available here. I assumed I would just need to download the checkpoint and upload it, but when I do that and try to test it with the Inference API, I get this error:

```
Could not load model myuser/mt5-large-es-nah with any of the following classes: (<class 'transformers.models.mt5.modeling_mt5.MT5ForConditionalGeneration'>, <class 'transformers.models.mt5.modeling_tf_mt5.TFMT5ForConditionalGeneration'>).

See the original errors:

while loading with MT5ForConditionalGeneration, an error is thrown:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 286, in hf_raise_for_status
    response.raise_for_status()
  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/myuser/mt5-large-es-nah/resolve/4de44c8119a027cc91beafa7c8011d58e2936cbf/tf_model.h5

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/transformers/src/transformers/utils/hub.py", line 616, in has_file
    hf_raise_for_status(r)
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 323, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-65dade2d-0352f53a421300694b0ea885)
Repository Not Found for url: https://huggingface.co/myuser/mt5-large-es-nah/resolve/4de44c8119a027cc91beafa7c8011d58e2936cbf/tf_model.h5.
Please make sure you specified the correct repo_id and repo_type.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/src/transformers/src/transformers/pipelines/base.py", line 278, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/transformers/src/transformers/modeling_utils.py", line 3432, in from_pretrained
    if has_file(pretrained_model_name_or_path, TF2_WEIGHTS_NAME, **has_file_kwargs):
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/transformers/src/transformers/utils/hub.py", line 627, in has_file
    raise EnvironmentError(f"{path_or_repo} is not a local folder or a valid repository name on 'https://hf.co'.")
OSError: myuser/mt5-large-es-nah is not a local folder or a valid repository name on 'https://hf.co'.

while loading with TFMT5ForConditionalGeneration, an error is thrown:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 286, in hf_raise_for_status
    response.raise_for_status()
  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/myuser/mt5-large-es-nah/resolve/4de44c8119a027cc91beafa7c8011d58e2936cbf/model.safetensors.index.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/transformers/src/transformers/utils/hub.py", line 616, in has_file
    hf_raise_for_status(r)
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 323, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-65dade2d-01ad42417da94e9415aa83a4)
Repository Not Found for url: https://huggingface.co/myuser/mt5-large-es-nah/resolve/4de44c8119a027cc91beafa7c8011d58e2936cbf/model.safetensors.index.json.
Please make sure you specified the correct repo_id and repo_type.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/src/transformers/src/transformers/pipelines/base.py", line 278, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/transformers/src/transformers/modeling_tf_utils.py", line 2827, in from_pretrained
    if has_file(pretrained_model_name_or_path, SAFE_WEIGHTS_INDEX_NAME, **has_file_kwargs):
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/transformers/src/transformers/utils/hub.py", line 627, in has_file
    raise EnvironmentError(f"{path_or_repo} is not a local folder or a valid repository name on 'https://hf.co'.")
OSError: myuser/mt5-large-es-nah is not a local folder or a valid repository name on 'https://hf.co'.
```

I'm not sure whether I should export it differently, upload it as a zip, or what else I'm missing. I've been following this guide: Uploading models, and I'm uploading via the web interface.

Hi,

If your model is from the Transformers library, the easiest approach is to use the push_to_hub method, as described here: Uploading models
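A minimal sketch of what that could look like. The local directory `"./checkpoint"` and the helper function name are assumptions for illustration; the repo id is the one from your error message.

```python
# Sketch: push a locally saved mT5 checkpoint to the Hugging Face Hub.
# Assumes the checkpoint was saved with save_pretrained() to "./checkpoint";
# upload_checkpoint() is a hypothetical helper name, not a library function.
from transformers import AutoTokenizer, MT5ForConditionalGeneration


def upload_checkpoint(local_dir: str, repo_id: str) -> None:
    """Load a local checkpoint and push the model + tokenizer to the Hub."""
    model = MT5ForConditionalGeneration.from_pretrained(local_dir)
    tokenizer = AutoTokenizer.from_pretrained(local_dir)
    # push_to_hub uploads the config, weights, and tokenizer files in the
    # repo layout the Inference API expects -- no zipping needed.
    model.push_to_hub(repo_id)
    tokenizer.push_to_hub(repo_id)


# Authenticate first with a write-access token, e.g. by running
# `from huggingface_hub import notebook_login; notebook_login()` in Colab,
# then:
# upload_checkpoint("./checkpoint", "myuser/mt5-large-es-nah")
```

Note that your 401 error also suggests the repo is private or was not created correctly; push_to_hub creates the repo for you, and you can make it public from the repo settings afterwards.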