Error with new tokenizers (URGENT!)

Here is how I resolved it:

  1. Uninstalled `transformers`
  2. Reinstalled `transformers` together with `sentencepiece`: `!pip install --no-cache-dir transformers sentencepiece`
  3. Loaded the tokenizer with `use_fast=False`: `tokenizer = AutoTokenizer.from_pretrained("XXXXX", use_fast=False)`