I am using the free version of the Inference API with a custom Stable Diffusion model. When trying to use the endpoint, I can't figure out how to set a custom size for the output (like in the pipeline, where it's just height=x and width=x)…
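For context, this is roughly what I mean by setting the size in the pipeline locally (a minimal sketch; the model ID is a placeholder for my custom model):

```python
from diffusers import StableDiffusionPipeline
import torch

# Load the custom model (placeholder ID)
pipe = StableDiffusionPipeline.from_pretrained(
    "my-username/my-custom-sd-model",
    torch_dtype=torch.float16,
).to("cuda")

# In the pipeline, the output size is just keyword arguments
image = pipe(
    "a photo of an astronaut riding a horse",
    height=768,  # custom output height
    width=512,   # custom output width
).images[0]
image.save("output.png")
```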
Hi @jetpackjules, the Inference API only accepts the prompt (passed as inputs), and most recently we have also added support for negative_prompts. If you need access to extra params, you can deploy the model as a Space or a custom Inference Endpoint.
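For reference, a minimal sketch of what a call to the free Inference API looks like under those constraints (the model ID, token, exact parameter name, and payload shape here are placeholders/assumptions, not a guaranteed spec):

```python
import requests

# Placeholder model ID and token
API_URL = "https://api-inference.huggingface.co/models/my-username/my-custom-sd-model"
headers = {"Authorization": "Bearer hf_xxx"}

payload = {
    "inputs": "a photo of an astronaut riding a horse",      # the prompt
    "parameters": {"negative_prompt": "blurry, low quality"},  # recently added support
    # Note: height/width are not accepted here, unlike the local pipeline
}

response = requests.post(API_URL, headers=headers, json=payload)
with open("output.png", "wb") as f:
    f.write(response.content)  # the API returns the image bytes
```

If you need full control over height, width, guidance scale, etc., a Space or a custom Inference Endpoint lets you run the diffusers pipeline yourself and expose whatever parameters you want.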