Tokenize a batch of data

Hi, I want to run multiple models on the same batch, but it seems each model has a different tokenizer. Is it possible to tokenize a batch once so that all the models can use it?
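In general the answer is that token IDs are vocabulary-specific, so a shared batch of text usually has to be encoded separately by each model's own tokenizer, rather than tokenized once and reused. A minimal sketch of that pattern, keeping the raw text batch and building one encoded batch per model (the two "tokenizers" here are toy stand-ins for real ones, e.g. Hugging Face `AutoTokenizer` instances loaded with `AutoTokenizer.from_pretrained(name)`):

```python
def word_tokenizer(texts):
    # Toy stand-in for a word/subword tokenizer.
    return [t.lower().split() for t in texts]

def char_tokenizer(texts):
    # Toy stand-in for a character-level tokenizer.
    return [list(t) for t in texts]

def encode_per_model(tokenizers, texts):
    """Encode one raw-text batch once per tokenizer, keyed by model name."""
    return {name: tok(texts) for name, tok in tokenizers.items()}

batch = ["Hello world", "Tokenize me"]
encoded = encode_per_model(
    {"model_a": word_tokenizer, "model_b": char_tokenizer}, batch
)
print(encoded["model_a"][0])  # ['hello', 'world']
```

With Hugging Face transformers the same pattern applies: call each model's tokenizer on the text batch (e.g. `tokenizer(texts, padding=True, truncation=True, return_tensors="pt")`) and feed the resulting encoding to that model only.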