I'm asking a very basic question, but any suggestions would be highly appreciated. I am not able to see the predictions from this piece of Hugging Face Longformer code; this is not really about Longformer itself but a generic question:
import torch
from transformers import LongformerForMaskedLM, LongformerTokenizer

model = LongformerForMaskedLM.from_pretrained('allenai/longformer-base-4096', return_dict=True)
tokenizer = LongformerTokenizer.from_pretrained('allenai/longformer-base-4096')

SAMPLE_TEXT = ' '.join(['Hello world! '] * 1000)  # long input document

input_ids = torch.tensor(tokenizer.encode(SAMPLE_TEXT)).unsqueeze(0)  # batch of size 1
attention_mask = None  # default is local attention everywhere, which is a good choice for MaskedLM
# ... check LongformerModel.forward for more details how to set attention_mask
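# (Aside, based on my reading of the Longformer docs: global attention, if wanted, would go
# through a separate global_attention_mask argument rather than attention_mask, e.g.
#     global_attention_mask = torch.zeros(input_ids.shape, dtype=torch.long)
#     global_attention_mask[:, 0] = 1  # global attention on the first (<s>) token
#     outputs = model(input_ids, attention_mask=attention_mask,
#                     global_attention_mask=global_attention_mask, labels=input_ids)
# I have left it as None here.)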
outputs = model(input_ids, attention_mask=attention_mask, labels=input_ids)
loss = outputs.loss
prediction_logits = outputs.logits
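
In case it clarifies what I mean by "seeing the predictions": my rough sketch of what I expected to work, assuming prediction_logits has shape (batch_size, sequence_length, vocab_size) as the docs suggest, is to argmax over the vocabulary and decode the result:

predicted_token_ids = prediction_logits.argmax(dim=-1)  # (1, sequence_length), most likely token at each position
predicted_text = tokenizer.decode(predicted_token_ids[0].tolist())  # decode batch element 0 back to text
print(predicted_text[:200])  # inspect the start of the predicted sequence

Is something along these lines the right way to read the predictions out of the model output?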