Hi, I want to run multiple models on the same batch, but each model seems to have its own tokenizer. Is it possible to tokenize the batch once and reuse it across the models?
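Since each tokenizer can produce different token ids for the same text, one workable pattern can be sketched as follows: tokenize the shared batch once per *distinct* tokenizer and cache the result, so models that share a tokenizer reuse it. The helper name and the stand-in tokenizers below are illustrative, not from any library; with Hugging Face models you would load each model's own tokenizer (e.g. `AutoTokenizer.from_pretrained(...)`) in their place.

```python
# Minimal sketch (illustrative names, no real library API): tokenize
# the same batch once per distinct tokenizer, caching by tokenizer
# identity so models that share one do not repeat the work.

def tokenize_batch_per_model(batch, model_tokenizers):
    """Return {model_name: tokenized batch}, tokenizing once per tokenizer.

    batch: list of strings.
    model_tokenizers: dict mapping model name -> tokenizer callable.
    """
    cache = {}    # id(tokenizer) -> tokenized output
    results = {}
    for name, tokenizer in model_tokenizers.items():
        key = id(tokenizer)
        if key not in cache:
            cache[key] = [tokenizer(text) for text in batch]
        results[name] = cache[key]
    return results

# Stand-in tokenizers for the sketch; with transformers you would use
# each model's own tokenizer loaded via AutoTokenizer.from_pretrained.
whitespace = lambda s: s.split()
chars = lambda s: list(s)

out = tokenize_batch_per_model(
    ["hello world", "tokenize me"],
    {"model_a": whitespace, "model_b": chars, "model_c": whitespace},
)
# model_a and model_c share a tokenizer, so their batch was tokenized
# only once and both names point at the same cached output.
assert out["model_a"] is out["model_c"]
```

The point of the cache is only to avoid redundant passes; the key takeaway is that you generally cannot feed one model's token ids to another model, so the batch has to go through each model's own tokenizer at least once.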