Using the prompt to switch models

yeah, regarding gradio.Interface.load()

import gradio
model1 = gradio.Interface.load("models/ItsJayQz/Marvel_WhatIf_Diffusion")

def process1(prompt):
    # calling the loaded interface with a prompt returns the generated image
    image_return = model1(prompt)
    return image_return

with gradio.Blocks() as nice_ui:
    input_text = gradio.Textbox(label="Enter Prompt Here")
    output_window = gradio.Image(label="Produced Image")
    run_button = gradio.Button("Run")
    run_button.click(process1, inputs=input_text, outputs=output_window)

nice_ui.queue(concurrency_count=100)
nice_ui.launch()

Don’t quote me on this, I usually error a couple times before getting the syntax right, but this is pretty close.


Thanks a lot, I will use this new code above
and the solution provided by @abidlabs

Hopefully, this will work faster


I’m going to mess with this task as well, because I think I’ve tried incorporating a dropdown with this method before and was having trouble. I think the dropdown needs to target a function that points to the model, rather than the model itself. Another option to keep on the board is gradio.Interface.from_pipeline(). I’ll copy you if you get it figured out xD
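
Roughly the shape I have in mind (untested sketch, and the repo names, function name, and choices are just placeholders):

import gradio as gr

# placeholder repos, purely for illustration
model_box = [
    gr.Interface.load("models/Repo1/Name1"),
    gr.Interface.load("models/Repo2/Name2"),
]

def run_model(prompt, model_index):
    # the dropdown hands this function an index; the function, not the
    # dropdown, is what actually touches the loaded model
    return model_box[model_index](prompt)

with gr.Blocks() as demo:
    model_index = gr.Dropdown(choices=["Repo1/Name1", "Repo2/Name2"], type="index", value="Repo1/Name1", label="Model")
    prompt = gr.Textbox(label="Prompt")
    result = gr.Image(label="Result")
    gr.Button("Run").click(run_model, inputs=[prompt, model_index], outputs=result)

demo.launch()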

It worked! I had actually given up on using a dropdown to select gradio.Interface.load() some time ago. Following @abidlabs’s example and changing:

models = [
    "models/Repo/Name",
]
to:
models = [
    gradio.Interface.load("models/Repo/Name"),
]
made it happen. Thanks for asking this question, and check it out:
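
In other words, each loaded entry behaves like a plain callable, so (with a placeholder prompt just for illustration) you can do something like:

# each loaded entry can then be called directly with a text prompt
image = models[0]("a prompt describing the image you want")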


Looks promising, but I don’t have this version working yet.
I made the space private in the meantime to get faster results.

Will make the space public once I’m done.


My code is full of remodels, but refined down it was something like:

"
model_box=[
gr.Interface.load(“models/Repo1/Name1”),
gr.Interface.load(“models/Repo2/Name2”),
gr.Interface.load(“models/Repo3/Name3”),
]

current_model=model_box[0]

def the_process(input_text, model_choice):
a_variable=model_box[model_choice]
output=a_variable(input_text)
return(output)

with gr.Blocks() as demo:
model_choice = gr.Dropdown(choices=[m for m in model_box], type=“index”, value=current_model)
input_text=gr.Textbox()
output_window=gr.Image()
the_button=gr.Button()

the_button.click(the_process, inputs=[input_text,model_choice], outputs=[output_window]

demo.launch()
"


I am finishing up, but it works now

Thanks a lot for your help. I am glad this is resolved,
and I learned a lot in the last 2 days thanks to you.

Here is the final version:
https://alstable-marvel.hf.space

As a side note, I had to get rid of one of the styles, DGSpitzer/Guan-Yu-Diffusion, as fetching that model was giving an error:
ValueError: Unsupported pipeline type: None

This thread can now be closed


Here is a fresh example.


Right on, I learned a lot here too.
That example I just left is trash for anybody following along, and nobody should use it. XD
Here is an example space employing “Gradio Dropdown to Select Interface.load(models)” to remember this journey.
