I am a newbie to huggingface/transformers…
I tried to follow the instructions at https://huggingface.co/transformers/model_doc/roberta.html#robertaforquestionanswering to try out this model, but I get errors.
from transformers import RobertaTokenizer, RobertaForQuestionAnswering
import torch
tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
model = RobertaForQuestionAnswering.from_pretrained('roberta-base')
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
start_positions = torch.tensor([1])
end_positions = torch.tensor([3])
outputs = model(**inputs, start_positions=start_positions, end_positions=end_positions)
loss, start_scores, end_scores = outputs[:3]
The tokenizer loads fine, but the error occurs when loading the QA model:
Traceback (most recent call last):
File "/gstore/home/madabhuc/.local/lib/python3.6/site-packages/transformers/modeling_utils.py", line 655, in from_pretrained
raise EnvironmentError
OSError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "test.py", line 7, in <module>
model = RobertaForQuestionAnswering.from_pretrained('roberta-base')
File "/gstore/home/madabhuc/.local/lib/python3.6/site-packages/transformers/modeling_utils.py", line 662, in from_pretrained
raise EnvironmentError(msg)
OSError: Can't load weights for 'roberta-base'. Make sure that:
- 'roberta-base' is a correct model identifier listed on 'https://huggingface.co/models'
- or 'roberta-base' is the correct path to a directory containing a file named one of pytorch_model.bin, tf_model.h5, model.ckpt.
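In case it helps anyone debugging a similar failure: the second condition in the error message suggests a workaround when the download from the hub fails (e.g. behind a proxy) — download the weights manually and point `from_pretrained()` at a local directory. Below is a small sketch (the helper name `find_weight_file` is mine, not part of transformers) that checks a directory for the weight file names the error message lists before attempting the load:

```python
import os

def find_weight_file(model_dir):
    """Return the first recognized weight file in model_dir, or None.

    These are the file names the error message says from_pretrained()
    accepts when given a local directory instead of a model identifier.
    """
    candidates = ("pytorch_model.bin", "tf_model.h5", "model.ckpt")
    for name in candidates:
        if os.path.isfile(os.path.join(model_dir, name)):
            return name
    return None
```

If the check succeeds, loading from the directory should work the same way as loading by identifier, e.g. `RobertaForQuestionAnswering.from_pretrained(model_dir)` (the directory also needs the config and tokenizer files alongside the weights).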