Error creating custom pre_tokenizer

Thank you for the reply. In that repo they are talking about *using* a tokenizer, but I want to *train* a tokenizer, and I don't think there is an option like `pre_tokenized=True` in `train_from_iterator`.
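
For context, here is a minimal sketch of the kind of setup I mean: attaching a pre-tokenizer to a `tokenizers.Tokenizer` and then calling `train_from_iterator`. The BPE model, `Whitespace` pre-tokenizer, vocab size, and tiny corpus are just placeholders for illustration, not my actual configuration:

```python
from tokenizers import Tokenizer, models, trainers, pre_tokenizers

# Placeholder corpus; in practice this would be an iterator over the real data.
corpus = ["first example sentence", "second example sentence"]

# Build a tokenizer with a BPE model (just an example model choice).
tokenizer = Tokenizer(models.BPE(unk_token="[UNK]"))

# The pre-tokenizer is set on the Tokenizer object itself; train_from_iterator
# has no pre_tokenized=True argument as far as I can tell.
tokenizer.pre_tokenizer = pre_tokenizers.Whitespace()

trainer = trainers.BpeTrainer(vocab_size=1000, special_tokens=["[UNK]"])

# train_from_iterator accepts an iterator over strings (or batches of strings).
tokenizer.train_from_iterator(corpus, trainer=trainer)
```

What I can't find is how to do the equivalent when the data is already pre-tokenized, which is why I was looking for something like a `pre_tokenized=True` flag.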
