When I use `map` on a `Dataset` object, the `input_ids` come back as lists instead of tensors:
def tokenize(batch):
    return tokenizer(batch['text'], padding=True, return_tensors='pt', truncation=True).to(DEVICE)

data.map(tokenize, batched=False, batch_size=None)
→ returns lists

Calling the tokenizer directly on the whole column, however:

tokenizer(data['text'], padding=True, return_tensors='pt', truncation=True).to(DEVICE)

→ returns tensors

Why does `map` discard the tensors, and how do I get tensors back from the mapped dataset?