I don't think the GPU is available

I am on the “Fine-tuning a model with the Trainer API” step of Chapter 3 of the NLP Course.

I read, somewhere earlier, that Google Colab makes a GPU available for free. At the point where the Trainer is invoked, the tutorial says, “This will start the fine-tuning (which should take a couple of minutes on a GPU)”. When I tried this step in the Colab notebook, it took several hours.

Does this mean that the GPU is no longer free in Colab, or is there a missed step that I need to know about?

NB: I am familiar with Jupyter notebooks but am fairly new to Colab and a total LLM newbie.

Hi @AltShift!
You can change the runtime type of a Colab notebook (Runtime → Change runtime type) as explained in this video.

When connected to the new runtime, you can check whether CUDA is available with the following code snippet:

import torch
print(torch.cuda.is_available())

Alternatively, you can define a device object if you need to pass one explicitly:

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

print(device)
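To make the device idea concrete, here is a minimal sketch (the tensor and shape are just illustrative) of selecting a device and moving data onto it:

```python
import torch

# Pick the GPU when available, otherwise fall back to CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Example: moving a tensor (a model works the same way via model.to(device))
x = torch.ones(2, 2).to(device)
print(x.device)
```

Note that the Trainer from the course places the model on the available GPU automatically, so a run taking hours usually just means the Colab runtime type was still set to CPU.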

Thanks @CKeibel !

