I’m getting this error:
```
Traceback (most recent call last):
  File "D:\PGRM\DecSD\diffusers\examples\inference\save_onnx.py", line 66, in <module>
    convert_to_onnx(pipe.unet, pipe.vae.post_quant_conv, pipe.vae.decoder, text_encoder, height=512, width=512)
  File "D:\PGRM\DecSD\diffusers\examples\inference\save_onnx.py", line 41, in convert_to_onnx
    traced_model = torch.jit.trace(unet, check_inputs, check_inputs=[check_inputs], strict=True)
  File "C:\Users\MYNAME\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\jit\_trace.py", line 759, in trace
  File "C:\Users\MYNAME\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\jit\_trace.py", line 976, in trace_module
RuntimeError: Encountering a dict at the output of the tracer might cause the trace to be incorrect, this is only valid if the container structure does not change based on the module's inputs. Consider using a constant container instead (e.g. for `list`, use a `tuple` instead. For `dict`, use a `NamedTuple` instead). If you absolutely need this and know the side effects, pass strict=False to trace() to allow this behavior.
```
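For context, the error message suggests replacing the dict output with a constant container before tracing. One common workaround (a sketch, not a verified fix for this script) is to wrap the model in a small module that unpacks the dict and returns the tensor directly; the wrapper name and the `"sample"` key are assumptions based on how diffusers models usually label their output:

```python
import torch


class UNetWrapper(torch.nn.Module):
    """Hypothetical wrapper: unpacks the UNet's dict-like output so
    torch.jit.trace can run with strict=True (no dict at the output)."""

    def __init__(self, unet):
        super().__init__()
        self.unet = unet

    def forward(self, sample, timestep, encoder_hidden_states):
        out = self.unet(sample, timestep, encoder_hidden_states)
        # Return the tensor itself instead of the dict that wraps it.
        return out["sample"]
```

Tracing `UNetWrapper(pipe.unet)` instead of `pipe.unet` would then avoid the dict output entirely, without needing `strict=False`.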
I have no idea how to solve it, and changing strict=True to False gave me a completely different error.
**I have already made a discussion post about this, and several users said they had the same problem:** CompVis/stable-diffusion-v1-4 · Having an issue with AMD GPU on Windows