All version info can be found at the bottom of the post.
I have a bit of standalone Python code that I’m planning on putting into a ComfyUI custom node. The code I’m running is:
import rembg
from PIL import Image

img = Image.open("./lizard.png")        # load the source image
img_nobg = rembg.remove(img)            # strip the background (runs an ONNX model via onnxruntime)
img_nobg.save("./lizard_nobg.png")      # save the result with transparency
Running this file from the command line produces the following output:
PS C:\Users\mallo\Coding\misc_scripts> python .\remove_background.py
2024-07-13 13:02:46.0723102 [E:onnxruntime:Default, provider_bridge_ort.cc:1731 onnxruntime::TryGetProviderInfo_TensorRT] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1426 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Users\mallo\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_providers_tensorrt.dll"
*************** EP Error ***************
EP Error D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:456 onnxruntime::python::RegisterTensorRTPluginsAsCustomOps Please install TensorRT libraries as mentioned in the GPU requirements page, make sure they're in the PATH or LD_LIBRARY_PATH, and that your GPU is supported.
when using ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider']
Falling back to ['CUDAExecutionProvider', 'CPUExecutionProvider'] and retrying.
****************************************
2024-07-13 13:02:46.1426031 [E:onnxruntime:Default, provider_bridge_ort.cc:1745 onnxruntime::TryGetProviderInfo_CUDA] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1426 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Users\mallo\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"
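For reference, onnxruntime can be asked which providers it thinks it has; this is a quick diagnostic I can run in the same Python environment (get_available_providers and get_device are standard onnxruntime calls, though as far as I understand they only report what the wheel was built with, not whether the provider DLLs actually load):

import onnxruntime as ort

# Report the installed build and the execution providers compiled into it.
# The LoadLibrary failures above happen later, when a session actually
# tries to load the TensorRT/CUDA provider DLLs.
print("onnxruntime version:", ort.__version__)
print("available providers:", ort.get_available_providers())
print("device:", ort.get_device())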
When I start my ComfyUI server, though, it detects onnxruntime just fine:
PS C:\Users\mallo\ComfyUI> python .\main.py
...
[comfyui_controlnet_aux] | INFO -> Using ort providers: ['CUDAExecutionProvider', 'DirectMLExecutionProvider', 'OpenVINOExecutionProvider', 'ROCMExecutionProvider', 'CPUExecutionProvider', 'CoreMLExecutionProvider']
DWPose: Onnxruntime with acceleration providers detected
...
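In case it helps narrow things down, one workaround I've been considering for the standalone script is pinning rembg to the CPU provider via an explicit session, so the failing TensorRT/CUDA DLLs are never touched. This is only a sketch and assumes new_session in this rembg version accepts a providers argument, which I haven't confirmed:

import rembg
from PIL import Image

# Request only the CPU execution provider, bypassing the TensorRT/CUDA
# provider DLLs that fail to load (assumes new_session takes providers=...).
session = rembg.new_session("u2net", providers=["CPUExecutionProvider"])

img = Image.open("./lizard.png")
img_nobg = rembg.remove(img, session=session)
img_nobg.save("./lizard_nobg.png")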
I’ve been trying to fix this for about a week now and have tried just about every combination of torch, CUDA, onnxruntime, and rembg versions I could find, so any help would be greatly appreciated.
OS: Windows 11
Python:
PS C:\Users\mallo> python --version
Python 3.10.6
Torch/CUDA:
PS C:\Users\mallo> python -m pip show torch
Name: torch
Version: 2.1.2+cu118
Summary: Tensors and Dynamic neural networks in Python with strong GPU acceleration
Home-page: https://pytorch.org/
Author: PyTorch Team
Author-email: packages@pytorch.org
License: BSD-3
Location: c:\users\mallo\appdata\local\programs\python\python310\lib\site-packages
onnxruntime:
PS C:\Users\mallo> python -m pip show onnxruntime-gpu
Name: onnxruntime-gpu
Version: 1.18.1
Summary: ONNX Runtime is a runtime accelerator for Machine Learning models
Home-page: https://onnxruntime.ai
Author: Microsoft Corporation
Author-email: onnxruntime@microsoft.com
License: MIT License
Location: c:\users\mallo\appdata\local\programs\python\python310\lib\site-packages
rembg:
PS C:\Users\mallo\Coding\misc_scripts> python -m pip show rembg
Name: rembg
Version: 2.0.57
Summary: Remove image background
Home-page: https://github.com/danielgatis/rembg
Author: Daniel Gatis
Author-email: danielgatis@gmail.com
License: UNKNOWN
Location: c:\users\mallo\appdata\local\programs\python\python310\lib\site-packages