ValueError: too many values to unpack (expected 2) when using BertTokenizer

Thank you very much, I was about to write the same!
I have one more question, I hope that is ok. How can I tokenize more than one prompt with its choices? Namely, I have a batch of prompts and each one has its three choices, and I can't figure out how to tokenize them. Should I use

encoding = tokenizer([prompt, prompt, prompt], [choice0, choice1, choice2], return_tensors='tf', padding=True)
inputs = {k: tf.expand_dims(v, 0) for k, v in encoding.items()}

for each prompt and its choices, one at a time? Or is there a way to tokenize them all in one batch?
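
To make it concrete, here is a minimal sketch of what I have in mind for batching (the prompts, choices, and num_choices below are just hypothetical example data; I'm assuming a flatten-then-reshape approach so the result has shape (batch_size, num_choices, seq_len) as the multiple choice model expects):

from transformers import BertTokenizer
import tensorflow as tf

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')

# Hypothetical batch: two prompts, each with three choices
prompts = ["prompt A", "prompt B"]
choices = [["a0", "a1", "a2"], ["b0", "b1", "b2"]]
num_choices = 3

# Flatten into parallel lists: repeat each prompt once per choice
flat_prompts = [p for p, cs in zip(prompts, choices) for _ in cs]
flat_choices = [c for cs in choices for c in cs]

# Tokenize all (prompt, choice) pairs at once
encoding = tokenizer(flat_prompts, flat_choices, return_tensors='tf', padding=True)

# Reshape from (batch_size * num_choices, seq_len) to (batch_size, num_choices, seq_len)
inputs = {k: tf.reshape(v, (len(prompts), num_choices, -1)) for k, v in encoding.items()}

Is something like this the right way to do it, or should I stick with the one-prompt-at-a-time loop?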

Thanks again,
Ayala