When calling
tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-base")
I get a ValueError: This tokenizer cannot be instantiated. Please make sure you have
sentencepiece installed in order to use this tokenizer.
I tried conda install -c conda-forge transformers[sentencepiece]
as well as conda install -c conda-forge sentencepiece,
with no success.
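For reference, one thing worth checking (an assumption on my part: the interpreter raising the error may not be the environment where sentencepiece was installed) is whether the package is actually visible to the running interpreter:

```python
import importlib.util

# Check whether sentencepiece is importable from this interpreter;
# find_spec returns None when the module cannot be located.
spec = importlib.util.find_spec("sentencepiece")
print("sentencepiece found" if spec is not None else "sentencepiece missing")
```

If this prints "sentencepiece missing", the install likely landed in a different conda environment or Python than the one running the code.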