from transformers import BertTokenizer
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
#1 tokens = tokenizer(texts, padding=True, truncation=True)
#2 tokens = tokenizer(texts, padding="max_length", …)
I was trying to find out the possible values for `padding` — i.e. True, "max_length", …? I can't seem to find them listed anywhere. I expected to find them in the documentation, but there is no relevant section.
Please direct me to the documentation.
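In case it helps clarify the question, my understanding (which may be incomplete — treat the exact accepted values as an assumption) is that `padding` chooses between padding to the longest sequence in the batch versus padding to a fixed `max_length`, or no padding at all. A minimal pure-Python sketch of that behavior, not the real `transformers` implementation:

```python
# Illustrative sketch only — NOT the transformers source. The strategy
# names (True/"longest", "max_length", False/"do_not_pad") are my
# assumption about what the tokenizer accepts.
def pad(batch, padding=True, max_length=None, pad_id=0):
    """Pad lists of token ids according to a padding strategy."""
    if padding in (True, "longest"):
        # Pad every sequence to the longest one in this batch.
        target = max(len(seq) for seq in batch)
    elif padding == "max_length":
        # Pad every sequence to a caller-supplied fixed length.
        if max_length is None:
            raise ValueError("max_length is required for 'max_length' padding")
        target = max_length
    elif padding in (False, "do_not_pad"):
        # Return the batch unchanged.
        return batch
    else:
        raise ValueError(f"unknown padding strategy: {padding!r}")
    return [seq + [pad_id] * (target - len(seq)) for seq in batch]

batch = [[101, 7592, 102], [101, 102]]
print(pad(batch, padding=True))                      # pads to length 3
print(pad(batch, padding="max_length", max_length=5))  # pads to length 5
```

So concretely: which of these strategy names does the real tokenizer accept, and where are they documented?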