Optimum onnxruntime-gpu not working inside a Docker container

I was trying to build an inference container using Optimum and onnxruntime-gpu. It works well on my machine (a T4), but I get this error when I run it inside a Docker container:
Asked to use CUDAExecutionProvider as an ONNX Runtime execution provider, but the available execution providers are ['AzureExecutionProvider', 'CPUExecutionProvider'].

I am using this container as a starting point: nvidia/cuda:12.2.2-base-ubuntu22.04

Also, I don’t see Docker images for Optimum with CUDA 12.x on Docker Hub.

Hi @rumbleFTW! Could you try with a Docker image based on CUDA 11.x?
Otherwise, you’ll need to install ONNX Runtime following

https://onnxruntime.ai/docs/install/#install-onnx-runtime-gpu-cuda-12x

for compatibility with CUDA 12.x.
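As a sketch of what that could look like, here is a hypothetical Dockerfile based on a CUDA 12.x runtime image. Note that the error above can also occur when the base image lacks the CUDA/cuDNN libraries ONNX Runtime needs at load time: the `-base` variant of the NVIDIA images ships almost no CUDA libraries, so a `-cudnn*-runtime-*` variant is a safer starting point. The exact image tag, the package index URL, and the installed versions are assumptions taken from the install page linked above; check the current ONNX Runtime docs before using them.

```dockerfile
# Assumption: a runtime image with cuDNN, not the `-base` variant,
# since onnxruntime-gpu needs the CUDA and cuDNN shared libraries.
FROM nvidia/cuda:12.2.2-cudnn8-runtime-ubuntu22.04

RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# At the time of this thread, onnxruntime-gpu wheels built against
# CUDA 12 were published on a separate package index (see the ONNX
# Runtime install page linked above); the URL below is an assumption.
RUN pip3 install "optimum[onnxruntime-gpu]" \
    --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-12/pypi/simple/
```

Remember to run the container with GPU access (e.g. `docker run --gpus all ...`); without it, only `CPUExecutionProvider` will be available. You can verify the result inside the container with `python3 -c "import onnxruntime; print(onnxruntime.get_available_providers())"` and check that `CUDAExecutionProvider` is listed.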

