Is there any way to produce skip-gram tokens with the tokenizers library?
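For illustration, here is a minimal sketch of one approach. It assumes the Hugging Face tokenizers library with a pretrained tokenizer (bert-base-uncased is just a placeholder choice); as far as I know the library has no built-in skip-gram component, so the pairing step is done in plain Python over the encoded token ids.

```python
from tokenizers import Tokenizer

# Assumption: any pretrained tokenizer works here; bert-base-uncased is only an example.
tokenizer = Tokenizer.from_pretrained("bert-base-uncased")

def skip_gram_pairs(text, window=2):
    """Yield (center_id, context_id) pairs from the token ids of `text`.

    `window` is the number of context tokens taken on each side of the center token.
    """
    ids = tokenizer.encode(text, add_special_tokens=False).ids
    for i, center in enumerate(ids):
        lo = max(0, i - window)
        hi = min(len(ids), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                yield center, ids[j]

# Usage: collect (center, context) id pairs for a sentence.
pairs = list(skip_gram_pairs("the quick brown fox jumps over the lazy dog"))
print(pairs[:5])
```

This keeps the tokenizer responsible only for producing token ids, while the skip-gram windowing stays outside it, so the same helper works with any Tokenizer object regardless of its model or pre-tokenizer.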