Hi,
I want to use the TPU provided by Kaggle in my project. I am using PyTorch XLA to do that:
import torch_xla
import torch_xla.core.xla_model as xm
device = xm.xla_device()
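To double-check that the runtime actually sees the TPU at all, I also list the XLA devices (if I understand the torch_xla API correctly, `xm.get_xla_supported_devices()` should enumerate them; the exact device strings are my guess):

```python
import torch_xla.core.xla_model as xm

# List every XLA device the runtime exposes; on a Kaggle TPU v3-8
# I would expect eight entries such as 'xla:0' ... 'xla:7'.
devices = xm.get_xla_supported_devices()
print(devices)
```

This prints a non-empty list for me, so the TPU seems to be visible to torch_xla.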
Then I define a model:
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")
And as far as I can see, the model is on the XLA device, so that part is fine:
model.device # device(type='xla', index=1)
Then I create a Trainer instance with my model:
trainer = Trainer(
    model,
    args,
    train_dataset=tokenized_datasets["train"],
    eval_dataset=tokenized_datasets["validation"],
    data_collator=data_collator,
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
And train it:
trainer.train()
But it seems to me that the Trainer is not using the XLA device, because the TPU shows as idle in Kaggle…
So, how do I use the TPU in Kaggle with PyTorch XLA in my case?
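For reference, the only multi-core pattern I have found in the torch_xla docs is to wrap the training in a function and launch it with `xmp.spawn`, roughly like below. Here `_mp_fn` and `build_trainer` are my own placeholder names (`build_trainer` would recreate the Trainer exactly as above), and `nprocs=8` assumes a v3-8 TPU; I am not sure how this is supposed to interact with Trainer:

```python
import torch_xla.distributed.xla_multiprocessing as xmp

def _mp_fn(index):
    # Each TPU core runs this function in its own process, so the
    # Trainer (and the model) would have to be constructed inside it,
    # one copy per process on that process's XLA device.
    trainer = build_trainer()  # hypothetical helper recreating my Trainer setup
    trainer.train()

if __name__ == "__main__":
    # Spawn one training process per TPU core (8 on a v3-8).
    xmp.spawn(_mp_fn, nprocs=8)
```

Is something like this required, or should trainer.train() pick up the TPU on its own in a Kaggle notebook?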