Hi @alkzar90!
Might it be because you have to accept the model's license before using it with the Inference API?
I wonder if passing your Hugging Face token to gr.Interface.load will fix it.
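Something like this minimal sketch, assuming your Gradio version's gr.Interface.load still accepts an api_key argument (newer releases expose this as gr.load(..., hf_token=...)); the model id and token below are placeholders:

```python
import gradio as gr

# Load the Space/model through the Inference API, passing your HF token
# so gated models you have accepted the license for can be accessed.
demo = gr.Interface.load(
    "models/your-username/your-gated-model",  # hypothetical model id
    api_key="hf_xxx",                         # replace with your Hugging Face token
)

demo.launch()
```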