I'm getting a ValueError: This tokenizer cannot be instantiated. Please make sure you have sentencepiece installed in order to use this tokenizer.
I installed and updated sentencepiece (0.1.95), but I'm still getting the same error. Can someone help?
Hi @Katarina, what happens if you install transformers in a new environment with
pip install transformers[sentencepiece]
Does that solve the problem?
Hi, thanks for the answer… I tried this, but I still get the same error.
Can you please share the code you are running and the full stack trace / error message?
Actually, it is working after I restarted my environment… thank you!
Instead of
pip install sentencepiece
use this one:
pip install transformers[sentencepiece] datasets
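As the thread shows, the error can persist even after installing, because an already-running interpreter or notebook kernel won't see a newly installed package until it is restarted. A quick way to confirm that sentencepiece is visible to the exact Python you are running is a check like the one below (a minimal diagnostic sketch; has_sentencepiece is a hypothetical helper, not part of transformers):

```python
import importlib.util


def has_sentencepiece() -> bool:
    """Return True if the sentencepiece module is importable
    in the currently running interpreter."""
    return importlib.util.find_spec("sentencepiece") is not None


if __name__ == "__main__":
    # If this prints False after you ran pip install,
    # restart the environment/kernel and check again.
    print(has_sentencepiece())
```

If this returns False right after a successful pip install, the install most likely went into a different environment than the one the interpreter is using, or the interpreter simply needs a restart.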