I keep getting the following error whenever I try to import transformers:
from transformers import AutoTokenizer
  File "/home/peter/.local/lib/python3.10/site-packages/transformers/__init__.py", line 30, in <module>
    from . import dependency_versions_check
  File "/home/peter/.local/lib/python3.10/site-packages/transformers/dependency_versions_check.py", line 41, in <module>
    require_version_core(deps[pkg])
  File "/home/peter/.local/lib/python3.10/site-packages/transformers/utils/versions.py", line 122, in require_version_core
    return require_version(requirement, hint)
  File "/home/peter/.local/lib/python3.10/site-packages/transformers/utils/versions.py", line 116, in require_version
    _compare_versions(op, got_ver, want_ver, requirement, pkg, hint)
  File "/home/peter/.local/lib/python3.10/site-packages/transformers/utils/versions.py", line 49, in _compare_versions
    raise ImportError(
ImportError: tokenizers>=0.11.1,!=0.11.3,<0.13 is required for a normal functioning of this module, but found tokenizers==0.19.1.
Try: pip install transformers -U or pip install -e '.[dev]' if you're working with git main
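If I'm reading the pin correctly, the tokenizers I have installed (0.19.1) simply falls outside the range this transformers build accepts (at least 0.11.1, not 0.11.3, and below 0.13). Here is a small sanity check I put together with the packaging library, which as far as I know is what defines how pip-style version specifiers are evaluated (the example versions below are just mine):

from packaging.specifiers import SpecifierSet
from packaging.version import Version

# The pin copied verbatim from the error message.
spec = SpecifierSet(">=0.11.1,!=0.11.3,<0.13")

print(Version("0.19.1") in spec)  # False: the version I actually have installed
print(Version("0.12.1") in spec)  # True: an example version that would satisfy the pin

If that's right, then reinstalling the same transformers release just re-imposes the same pin, which would explain why reinstalling hasn't gotten me anywhere.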
I've tried reinstalling both transformers and tokenizers, but that doesn't help. It seems strange that two packages which both come from HuggingFace should be incompatible with each other. Can anyone tell me what's wrong and how to fix it?
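For reference, the installed versions can be confirmed without importing transformers at all; importlib.metadata (standard library, Python 3.8+) reads the package metadata directly, so it keeps working even while the import itself raises the error above:

from importlib.metadata import version

# Queries the installed distributions' metadata rather than importing them,
# so this runs fine even though `import transformers` fails.
print("transformers:", version("transformers"))
print("tokenizers:", version("tokenizers"))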