>>> from transformers import BartTokenizerFast
>>> tokenizer = BartTokenizerFast.from_pretrained("facebook/bart-large")
>>> text = "How are you?"
>>> tokenizer(text, return_tensors="pt")
{'input_ids': tensor([[ 0, 6179, 32, 47, 116, 2]]), 'attention_mask': tensor([[1, 1, 1, 1, 1, 1]])}
>>> tokenizer(text, padding=True, max_length=10, return_tensors="pt")
{'input_ids': tensor([[ 0, 6179, 32, 47, 116, 2]]), 'attention_mask': tensor([[1, 1, 1, 1, 1, 1]])}
Why didn't the second call pad the output to `max_length` and show something like
{'input_ids': tensor([[ 0, 6179, 32, 47, 116, 2, 1, 1, 1, 1]]), 'attention_mask': tensor([[1, 1, 1, 1, 1, 1, 0, 0, 0, 0]])}
and how can I make it do that?