pyNVML is kinda terrible, but if you can't access the GPU using pyNVML, then the problem is with your Python / Jupyter environment, not with torch or any of the libraries that build on it. `!nvidia-smi` is not enough to verify that, since Jupyter just hands the command straight to bash; it runs in a separate shell process, so it doesn't prove that your Python process itself can reach the driver.
The package is on PyPI: https://pypi.org/project/pynvml/
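As a minimal sketch, a check from inside a notebook cell could look like this (standard pynvml API; the memory-unit comments reflect NVML's documented behavior):

```python
import pynvml

# Initialize NVML; this is the call that fails if your Python
# environment can't reach the NVIDIA driver at all.
pynvml.nvmlInit()

# Enumerate every GPU the driver exposes to this process.
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(handle)       # bytes on older pynvml versions
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .used / .total are in bytes
    print(f"GPU {i}: {name}, {mem.used / 1e9:.2f} / {mem.total / 1e9:.2f} GB used")

pynvml.nvmlShutdown()
```

If `nvmlInit()` raises here while `!nvidia-smi` works in the same notebook, the notebook's Python environment is the problem.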
EDIT: But it sounds like the GPU is busy. Are you sure you don't have another Jupyter session running? Even if it has finished training, it won't release GPU memory until you shut down the kernel.
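You can also ask NVML which processes are actually holding the GPU. A rough sketch, assuming device index 0 (a stale Jupyter kernel will show up here even after training has finished):

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assuming device 0

# Each entry has a .pid and .usedGpuMemory (bytes).
for proc in pynvml.nvmlDeviceGetComputeRunningProcesses(handle):
    used = (proc.usedGpuMemory or 0) / 1e9  # can be None under some drivers
    print(f"pid {proc.pid}: {used:.2f} GB")

pynvml.nvmlShutdown()
```

If one of those PIDs is an old kernel, shutting that kernel down (not just interrupting it) is what frees the memory.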