How to load local models

Hello.

I looked through the tutorial pages on https://www.gradio.app/.

It seems that Gradio can launch an app with models hosted on the Hugging Face Hub.

Is it possible to load a model stored on my local machine? If so, could you tell me how?


I just solved it. Here is an example:

import gradio as gr
from transformers import pipeline

# Path to the model directory saved on the local machine
model = "./models/mt5-small-finetuned-amazon-en-es"
summarizer = pipeline("summarization", model=model)

def summary(text):
    # Return only the generated summary string from the pipeline output
    return summarizer(text)[0]["summary_text"]

with gr.Blocks() as demo:
    input_text = gr.Textbox(placeholder="Enter the book review…", lines=4)
    output_text = gr.Textbox(label="Summary")
    btn = gr.Button("Generate the summary")

    btn.click(summary, input_text, output_text)

demo.launch()
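If you want more control over loading (for example, to inspect the model or choose the device yourself), you can also load the tokenizer and model explicitly and pass them to the pipeline. A minimal sketch, assuming the same local checkpoint directory as above:

import gradio as gr
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, pipeline

# Assumed local directory, same as in the example above
model_dir = "./models/mt5-small-finetuned-amazon-en-es"

# Load tokenizer and model objects from the local files
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForSeq2SeqLM.from_pretrained(model_dir)

# Hand the already-loaded objects to the pipeline instead of a path
summarizer = pipeline("summarization", model=model, tokenizer=tokenizer)

def summary(text):
    return summarizer(text)[0]["summary_text"]

# gr.Interface is the simpler alternative to Blocks for a single-function demo
gr.Interface(fn=summary, inputs=gr.Textbox(lines=4), outputs=gr.Textbox(label="Summary")).launch()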