Text-embeddings-inference docker image fails to run

I am unable to `docker run` the text-embeddings-inference Docker images (I have tried several) in my local Docker environment.

  • Running H100 GPUs.
  • Ubuntu host.
  • Recent Docker Engine install, CUDA 12.4.
  • Verified `LD_LIBRARY_PATH` exists and contains the proper directory.
  • Verified the directory is on `PATH`.
  • Checked that the symbolic link `libcuda.so.1` was created and points to the current version, `libcuda.so.550.127.05`.
  • Symbolic link and file confirmed to exist in the proper directory (`/usr/lib/x86_64-linux-gnu`).
  • Able to access the file through the symbolic link with a non-elevated account.
  • Permissions on `libcuda.so.550.127.05` are 644.
  • Used `text-embeddings-inference:89-1.2` and `1.5` (and others).

Error received: “error while loading shared libraries: libcuda.so.1: cannot open shared object file.”


This error typically appears when the CUDA toolkit is not installed or the path is not set up, but it sounds like it is installed…
Perhaps the path reference is not being resolved correctly by a specific library.

Thank you for your response. I verified the CUDA toolkit is installed and the path works for a non-root account. I have two AI servers from Lambda Labs. The first server is running CUDA compilation tools 12.2.140 and the text-embeddings-inference container starts and runs fine. The second server is running 12.4.131 and fails to start with the error mentioned in the above post. Lambda Labs is also investigating.
