Generating a consistent background using diffusion models

Hi everyone!

How can I condition a diffusion model to generate the same background image across multiple generations?

I have foreground masks (all for the same object), and I want to generate the same room environment behind the object in every image.

Is this possible, and if so, how?

If the backgrounds only need to be similar, you could get there by fine-tuning the model, but if you want an identical background, inpainting is probably the quicker route.

There is also a more advanced method called ControlNet, but it is harder to use, so it is worth reading up on it before committing.
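For the inpainting route, here is a minimal sketch with the diffusers library, assuming the SDXL inpainting checkpoint; the file names and prompt are placeholders for your own assets. The key point is that inpainting only repaints the white region of the mask, so an inverted foreground mask makes the model fill in the background around your object:

```python
import torch
from diffusers import AutoPipelineForInpainting
from diffusers.utils import load_image

# Inpainting repaints only the white area of the mask, so inverting the
# foreground mask tells the model to fill in the background around the object.
pipe = AutoPipelineForInpainting.from_pretrained(
    "diffusers/stable-diffusion-xl-1.0-inpainting-0.1",
    torch_dtype=torch.float16,
).to("cuda")

image = load_image("product.png")         # placeholder: photo with your object
mask = load_image("background_mask.png")  # placeholder: inverted object mask (white = repaint)

# A fixed seed makes a single image/mask pair reproducible, but it does not
# guarantee identical backgrounds across different masks.
generator = torch.Generator("cuda").manual_seed(42)

result = pipe(
    prompt="a bright photo studio with plain walls and a wooden floor",
    image=image,
    mask_image=mask,
    generator=generator,
).images[0]
result.save("with_background.png")
```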

Thank you for your feedback!

I actually tried inpainting techniques, starting from SD 1.5, then SDXL, and finally FLUX inpainting.

Going left to right through those models, my output got better and better. However, even with ControlNet and inpainting, I still have issues when generating a studio environment: small changes to details like the doors, windows, and carpet make the images look odd, as if the product is not in the same room.
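For reference, this is roughly the kind of ControlNet + inpainting setup I have been testing (a sketch; the model IDs, file names, and prompt stand in for my actual assets):

```python
import numpy as np
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetInpaintPipeline
from diffusers.utils import load_image

def make_inpaint_condition(image, image_mask):
    # Control image for the inpaint ControlNet: masked pixels are set to -1.
    image = np.array(image.convert("RGB")).astype(np.float32) / 255.0
    mask = np.array(image_mask.convert("L")).astype(np.float32) / 255.0
    image[mask > 0.5] = -1.0
    image = np.expand_dims(image, 0).transpose(0, 3, 1, 2)
    return torch.from_numpy(image)

controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/control_v11p_sd15_inpaint", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetInpaintPipeline.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

init_image = load_image("product.png")          # placeholder
mask_image = load_image("background_mask.png")  # placeholder: white = repaint
control_image = make_inpaint_condition(init_image, mask_image)

result = pipe(
    prompt="a photo studio room, softbox lighting, plain backdrop",
    image=init_image,
    mask_image=mask_image,
    control_image=control_image,
    generator=torch.Generator("cuda").manual_seed(42),
).images[0]
result.save("controlnet_inpaint.png")
```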

If you need a certain degree of consistency for objects within the generated image, then a technique in a similar direction to Virtual Try-On may be suitable. :thinking: With diffusion models alone, such as SD and FLUX, there are inevitably some ambiguous areas. Another possibility is a special ControlNet such as Flux Edit.
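Along those lines, one pragmatic way to guarantee an identical room is to never regenerate it: generate the background once, composite each product onto that same image, and, if needed, inpaint only a thin seam band around the object to blend it in. A sketch with PIL (file names are placeholders):

```python
from PIL import Image, ImageChops, ImageFilter

# The room image is generated once and reused, so it stays pixel-identical
# across every product shot.
background = Image.open("studio_room.png").convert("RGB")  # placeholder
product = Image.open("product.png").convert("RGB")         # placeholder
mask = Image.open("product_mask.png").convert("L")         # white = product

# Paste the product onto the fixed background via its mask.
composite = background.copy()
composite.paste(product, (0, 0), mask)
composite.save("composite.png")

# Seam band = dilated mask minus original mask; feeding only this thin
# border to an inpainting pipeline blends lighting and shadows while
# leaving the rest of the room untouched.
dilated = mask.filter(ImageFilter.MaxFilter(15))
seam_mask = ImageChops.subtract(dilated, mask)
seam_mask.save("seam_mask.png")
```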