The doc suggests that installing with the commands:
pip install 'transformers[torch]'
uv pip install 'transformers[torch]'
will get a CPU-only install (I don’t have a GPU). So why does it take >2 GB of my disk space for CUDA-specific libraries? Especially since I’m going to run this in a Docker-type environment, I’d like to know whether it’s possible to install without the GBs of CUDA libraries. If that breaks transformers functionality, I’d be happy to edit the docs accordingly.
I do realize these libraries get pulled in by torch, not by transformers itself, but it would be nice to know if there’s a way to slim this down when it’s not needed.
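For context, a sketch of one possible workaround: PyTorch publishes a CPU-only wheel index, so installing torch from there first and transformers afterwards should avoid the CUDA dependencies. I haven’t verified this against every transformers release, so treat it as an assumption rather than a documented path:

```shell
# Install CPU-only PyTorch wheels (no bundled CUDA libraries) from
# PyTorch's dedicated CPU index, then install transformers on top.
pip install torch --index-url https://download.pytorch.org/whl/cpu
pip install transformers

# Equivalent with uv:
uv pip install torch --index-url https://download.pytorch.org/whl/cpu
uv pip install transformers
```

In a Dockerfile the same two-step install keeps the image several GB smaller, since the `nvidia-*` wheels are never downloaded.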