Cannot initialize deberta-v3-base tokenizer

when calling

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-base")

I get a ValueError: This tokenizer cannot be instantiated. Please make sure you have sentencepiece installed in order to use this tokenizer.

I tried conda install -c conda-forge "transformers[sentencepiece]" as well as conda install -c conda-forge sentencepiece, but neither resolved the error.


Running pip install sentencepiece in the same environment and then restarting the kernel should do the trick. The restart matters: the tokenizer check happens at import time, so an already-running kernel won't pick up the newly installed package.
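If you want to confirm the package is actually visible to the kernel before retrying from_pretrained, a minimal check (independent of transformers) is:

```python
import importlib.util

def has_sentencepiece() -> bool:
    # True if the sentencepiece package is importable
    # in the currently running Python environment
    return importlib.util.find_spec("sentencepiece") is not None

print(has_sentencepiece())
```

If this prints False after installing, the install likely went into a different environment than the one your kernel is using.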


Thank you!