How to set a custom output size with the Inference API

I am using the free tier of the Inference API with a custom Stable Diffusion model. When trying to use the endpoint, I am unable to figure out how to set a custom size for the output (the way the pipeline lets you pass `height=x` and `width=x`)…

Is this possible?

Hi @jetpackjules, the Inference API only accepts the prompt, passed as `inputs`, and most recently we have also added support for `negative_prompt`. If you need access to extra params, you can deploy the model as a Space or as a custom Inference Endpoint.
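For reference, a minimal sketch of what such a request looks like with only the standard library. The model ID and token are placeholders, and the exact set of supported parameters may change; `height`/`width` are not guaranteed to be honored on the free API, which is the limitation discussed above.

```python
import json
import urllib.request

# Placeholder model ID -- substitute your own custom model's repo ID.
API_URL = "https://api-inference.huggingface.co/models/MODEL_ID"


def build_payload(prompt, negative_prompt=None):
    """Build the JSON body: the prompt goes in `inputs`."""
    payload = {"inputs": prompt}
    if negative_prompt is not None:
        # negative_prompt support was added later, per the reply above
        payload["parameters"] = {"negative_prompt": negative_prompt}
    return payload


def query(prompt, token, negative_prompt=None):
    """POST the payload; the response body is the generated image bytes."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt, negative_prompt)).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

You would then write the returned bytes to a file, e.g. `open("out.png", "wb").write(query("an astronaut", token))`.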

Thanks for the quick reply!

Does this mean that there is no way to accomplish this for free?

You can try to find a Space with a community-granted GPU that exposes an open API.
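A Space's open API can typically be called with the `gradio_client` library. A minimal sketch, assuming a hypothetical Space ID and a single-prompt signature (every Space's inputs differ, so check its API page first):

```python
def generate_from_space(space_id, prompt):
    """Call a public Gradio Space's API; requires `pip install gradio_client`.

    `space_id` is a hypothetical example like "someuser/sd-demo" -- you need
    to find a Space that actually wraps the model you want.
    """
    from gradio_client import Client  # imported lazily; third-party package

    client = Client(space_id)
    # The argument list must match that Space's declared inputs.
    return client.predict(prompt)
```

Whether this helps with a *custom* model depends on finding (or creating) a Space that loads that model; a Space you create yourself runs on free CPU hardware unless a GPU grant is approved.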

Not sure how to do that…

Also, would it still work with a custom model?