Why does installing "CPU-only version of Transformers" install multiple GB of CUDA libs?

The doc suggests that installing with the commands:

pip install 'transformers[torch]'
uv pip install 'transformers[torch]'

will give a CPU-only install (I don’t have a GPU). So why does it take >2 GB of disk space for CUDA-specific libraries? Since I’m going to run this in a Docker-type environment, I’d like to know whether it’s possible to install without the gigabytes of CUDA libraries. If skipping them breaks Transformers functionality, I’d be interested in editing the docs accordingly.

I do realize that it’s getting installed because of torch, not because of transformers itself, but it would be nice to know whether there’s a way to slim this down when it isn’t needed.


The Transformers library also works with the CPU-only build of PyTorch. However, the default pip install torch on Linux pulls in the CUDA-enabled wheel, which is why the CUDA libraries end up on disk. You can slim the install by first installing the CPU-only PyTorch build, and then installing Transformers with pip install transformers.
https://stackoverflow.com/questions/78947332/how-to-install-torch-without-nvidia
https://stackoverflow.com/questions/51730880/where-do-i-get-a-cpu-only-version-of-pytorch
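Concretely, the approach from those answers looks like this: point pip at PyTorch’s CPU-only wheel index first, then install Transformers on top. (The index URL below is the one PyTorch publishes for CPU wheels; the exact version pinned is just an example.)

```shell
# Install the CPU-only PyTorch wheel (no CUDA libraries bundled)
pip install torch --index-url https://download.pytorch.org/whl/cpu

# Then install Transformers; pip sees torch is already satisfied
# and does not pull in the CUDA-enabled build
pip install transformers
```

In a Dockerfile you can do the same in two RUN steps, or put the `--index-url` line in a constraints/requirements file so the CPU wheel always wins.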


This topic was automatically closed 12 hours after the last reply. New replies are no longer allowed.