I am using the Hugging Face Transformers `Trainer` API to train a BART model on a server. There is enough free GPU memory, but the training process runs only on the CPU instead of the GPU.
I tried to use numba, as in this example, to add function decorators, but that didn't help.
Why is the training running on the CPU instead of the GPU? How can I fix this and make it run on the GPU?
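For context, here is a minimal diagnostic sketch (plain PyTorch, no `Trainer`-specific code) that I could run to check whether PyTorch can see the GPU at all; as far as I understand, `Trainer` only moves the model to a CUDA device when `torch.cuda.is_available()` returns `True`:

```python
import torch

# Trainer picks a CUDA device automatically only when PyTorch
# itself can see one, so check that first.
cuda_ok = torch.cuda.is_available()
print("CUDA available:", cuda_ok)
print("PyTorch version:", torch.__version__)

if cuda_ok:
    print("Device count:", torch.cuda.device_count())
    print("Device name:", torch.cuda.get_device_name(0))
else:
    # A CPU-only PyTorch build is a common cause; its version
    # string then typically ends with "+cpu".
    print("PyTorch cannot see a GPU on this machine.")
```

If this prints `CUDA available: False`, the problem is below the `Trainer` layer (driver, CUDA toolkit, or a CPU-only PyTorch install), not in my training script.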
Thank you for your help!