Passing a list of inputs to tokenize()

In this tutorial (Processing the data - Hugging Face Course), they appear to pass a collection of inputs to Tokenizer.tokenize(). But when I pass a list of texts, I get an error, and the source code (transformers/tokenization_utils.py at main · huggingface/transformers · GitHub) suggests tokenize() only accepts a single input. So how do I tokenize many texts at once, the way the tutorial seems to?
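
Here is a minimal sketch of what I'm running (the checkpoint and the example sentences are just placeholders, not my real data):

```python
from transformers import AutoTokenizer

# placeholder checkpoint; my real one differs
checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

raw_inputs = [
    "I've been waiting for a HuggingFace course my whole life.",
    "So have I!",
]

# a single string works fine
print(tokenizer.tokenize(raw_inputs[0]))

# passing the whole list is what raises an error for me
tokens = tokenizer.tokenize(raw_inputs)
```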

Can you post the exact error message and traceback you get?