ValueError: sentencepiece


I get a ValueError: "This tokenizer cannot be instantiated. Please make sure you have sentencepiece installed in order to use this tokenizer."

I installed and updated sentencepiece (0.1.95) but I'm still getting the same error. Can someone help?
Thank you!

Hi @Katarina, what happens if you try installing transformers in a new environment with

pip install transformers[sentencepiece]

Does that solve the problem?


Hi, thanks for the answer… I tried this but I still get the same error.

Can you please share the code you are running and the full stack trace / error message?

Actually, it is working after I restarted my environment… thank you!


This works:

pip install sentencepiece

Use this one:

pip install transformers[sentencepiece] datasets

This actually has to do with your shell. If you are on a Mac you are probably using zsh, which treats the square brackets as a glob pattern and reports zsh: no matches found: transformers[sentencepiece].

Solution: pip install "transformers[sentencepiece]"


Fix: pip install transformers[sentencepiece]
If it is already installed and you are still getting the error, restart the kernel.
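If you want to confirm programmatically whether the package is visible to the interpreter you are running (a restart matters because a kernel started before the install won't see the new package), here is a minimal sketch; the `module_available` helper is just an illustrative name, not part of transformers:

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if the named package can be imported in this interpreter."""
    return importlib.util.find_spec(name) is not None

# Check for sentencepiece before instantiating a slow tokenizer.
if not module_available("sentencepiece"):
    print('Not found — install it with: pip install "transformers[sentencepiece]"')
    print("Then restart the kernel so the running interpreter picks it up.")
```

If this prints the install message even after a successful `pip install`, you are likely installing into a different environment than the one your kernel uses.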


Kernel Restart solved this problem.

Thank you!

Use pip install transformers[sentencepiece]

And restart the kernel. It will work!