Using a second GPU?

I have two identical 3070 Ti GPUs. I want my Gradio Stable Diffusion HLKY webui to run on GPU 1, not GPU 0.

Couldn’t find the answer anywhere, and fiddling with every file just didn’t work.

Hi @Telecino! Setting the pipeline’s device to cuda:1 should move it to your second GPU:

from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", use_auth_token=True)
pipe = pipe.to("cuda:1")

I think the latest version supports running with a --gpu 1 parameter.

Hey Anton :slight_smile: please tell me… where exactly would I add these 3 lines of code (or do you have a suggestion)?

Hey Nidkal!

Since I also installed the face fixer and upscaler, in 'scripts/relauncher.py' I added

additional_arguments = "--gpu 1 --esrgan-gpu 1 --gfpgan-gpu 1"

When I generate, I get the error: Expected all tensors to be on the same device, but found at least two devices, cuda:1 and cuda:0! (when checking argument for argument index in method wrapper__index_select)

I get the same error when I edit these 3 values in webui.py, setting them to default=1.
I get the same error when I only pass --gpu 1.
I tried all permutations of the above.

Any idea?
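
For context, that error usually means one tensor ended up on cuda:0 while another is on cuda:1, likely because some part of the code still places tensors on the default device. A minimal sketch with hypothetical tensors that should reproduce a very similar message, assuming both GPUs are visible:

import torch

# An embedding lookup uses index_select internally, which matches the
# wrapper__index_select mentioned in the error text
emb = torch.nn.Embedding(10, 4).to("cuda:0")    # weights on the first GPU
idx = torch.tensor([1, 2, 3], device="cuda:1")  # indices on the second GPU
out = emb(idx)  # RuntimeError: Expected all tensors to be on the same device ...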

Hi @Telecino! I believe you are using the sd-webui project; have you checked their discussions page or their Discord server? These forums are for the diffusers library :slight_smile:

Both TensorFlow and PyTorch have a way to limit which GPUs are visible.

For example, os.environ["CUDA_VISIBLE_DEVICES"]="1"
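
A minimal sketch of that approach (the environment variable has to be set before torch is imported, or exported in the shell that launches the webui). With only the second card visible, it appears as cuda:0 inside the process, which also avoids the mixed cuda:0/cuda:1 situation:

import os
os.environ["CUDA_VISIBLE_DEVICES"] = "1"  # hide GPU 0; must run before importing torch

import torch
from diffusers import StableDiffusionPipeline

print(torch.cuda.device_count())  # -> 1 (the physical second GPU, now seen as cuda:0)

pipe = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", use_auth_token=True)
pipe = pipe.to("cuda")  # lands on the only visible device, i.e. physical GPU 1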