Loading my yolov5 model from wandb to Gradio

I am working in Colab and trying to load a YOLOv5 model from my W&B run so that I can use it in a Gradio UI. I tried model = torch.hub.load('yolov5/wandb.run.dir', 'model.h5') and other variants without success. How do I load the model for use in the Gradio interface, please?

Hello :wave:
You can use your model inside the function that is passed to your interface, as follows:

def infer(inputs):
    # run inference with the model loaded outside this function
    outputs = model.predict(inputs)
    return outputs

then do
gr.Interface(infer, ...).launch()

You can check out this blog post or the Gradio docs.
