Importing tokenizers version >0.10.3 fails due to openssl

So, the only tokenizers version I can install is 0.10.3 or lower. Any higher version (0.11 or above) runs into a libssl error (specifically, libssl.so.3 does not exist). As far as I understand, this is related to OpenSSL 1.1.1 being installed instead of OpenSSL 3. However, some of the packages installed alongside Hugging Face libraries (e.g. TensorFlow) require OpenSSL 1.1.1 or lower, meaning that I cannot install OpenSSL 3.
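In case it helps narrow things down, here is a rough sketch of two quick checks (assuming a Linux system, run inside the same environment that fails) to see which OpenSSL is actually available:

```
# Show which OpenSSL the Python interpreter itself was built against:
python -c "import ssl; print(ssl.OPENSSL_VERSION)"

# Check whether a libssl.so.3 is visible to the dynamic linker at all:
ldconfig -p | grep libssl
```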

Has anybody run into similar problems and found a solution?


For me, the fix was to install tokenizers using pip instead of conda.
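Roughly along these lines (the exact steps may differ in your setup; removing the conda copy first is just what worked for me):

```
# Remove the conda-installed tokenizers, then reinstall from PyPI;
# the prebuilt wheels avoided the libssl mismatch in my environment.
conda remove tokenizers
pip install tokenizers
```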