Transferring the style of one image to another image using diffusion models

Hello everyone!

I’m trying to transfer the style of one specific image to another specific image. I’ve tried lots of different approaches and models, but they all failed. I would like to preserve the initial shape and add the style on top of it. Please don’t suggest PyTorch neural style transfer: to my eye it gives a different kind of result (it essentially overlays two semi-transparent pictures with some minor changes). I’m looking for a result where both images genuinely diffuse into each other.

Since there’s no built-in process for merging two images, I also tried fine-tuning different SD models on the required style and running them on the initial image with the img2img process, using my learned model token. The output is pretty messed up (or shows almost no changes at lower denoising strengths).

Any ideas what the best approach is to achieve this?

I haven’t tried training an inpainting model for it. Not sure if that would be useful.

Any help would be much appreciated!