`gr.Interface.load(hugging_model_repo)` -> AssertionError: Invalid model name or src


I’m trying to use gr.Interface.load() to create a quick demo from a ViT fine-tuned model repo, but the build fails with an error (demo repo linked, full error below). I made another demo following the same process before (ViT fine-tuning → model repo → Gradio), and it works. Any hints about the assertion error are welcome :blush:. The model is fetched correctly (Fetching model from: https://huggingface.co/alkzar90/skynet), and the repo URL is ok.

Fetching model from: https://huggingface.co/alkzar90/skynet
Traceback (most recent call last):
  File "app.py", line 5, in <module>
    iface = gr.Interface.load(name='alkzar90/skynet', src='models')
  File "/home/user/.local/lib/python3.8/site-packages/gradio/interface.py", line 108, in load
    return super().load(name=name, src=src, api_key=api_key, alias=alias, **kwargs)
  File "/home/user/.local/lib/python3.8/site-packages/gradio/blocks.py", line 1123, in load
    return external.load_blocks_from_repo(name, src, api_key, alias, **kwargs)
  File "/home/user/.local/lib/python3.8/site-packages/gradio/external.py", line 57, in load_blocks_from_repo
    blocks: gradio.Blocks = factory_methods[src](name, api_key, alias, **kwargs)
  File "/home/user/.local/lib/python3.8/site-packages/gradio/external.py", line 70, in from_model
    assert response.status_code == 200, "Invalid model name or src"
AssertionError: Invalid model name or src

The only difference between the two model repos is the dataset; both models were trained with the same Transformers pipeline.


Hi @alkzar90 !

Might it be because you have to accept the model’s license before using it with the Inference API?

I wonder if passing in your huggingface token to gr.Interface.load will fix it.


Hey @freddyaboulton, it works, thanks!
Adding that suggestion to the error message could be helpful :open_mouth:

gr.Interface.load(..., api_key='read_token')
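For anyone who lands here later: the assert that fires (external.py, from the traceback above) is just a status-code check on the Hub request, so a gated or license-protected repo queried without a token trips the same "Invalid model name or src" message as a genuinely wrong repo name. A minimal sketch of that check, using the standard library instead of gradio's internals; `model_status` and `assert_model_accessible` are my own illustrative helpers, and the exact endpoint gradio queries is an assumption:

```python
from typing import Optional
from urllib.error import HTTPError
from urllib.request import Request, urlopen


def model_status(repo_id: str, token: Optional[str] = None) -> int:
    # Hypothetical helper (not part of gradio) mirroring the request
    # gradio makes before the failing assert; gated/private repos
    # answer 401/403 when no Authorization header is sent.
    req = Request(f"https://huggingface.co/api/models/{repo_id}")
    if token:
        req.add_header("Authorization", f"Bearer {token}")
    try:
        return urlopen(req).status
    except HTTPError as err:
        return err.code


def assert_model_accessible(status_code: int) -> None:
    # The same shape of check that raises in gradio's external.py:
    # anything other than 200 produces "Invalid model name or src".
    assert status_code == 200, "Invalid model name or src"
```

Passing api_key makes gradio send the Authorization header, so the Hub returns 200 and the assert passes.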
