Hello,
I am using an Apple M1 laptop and trying to follow along with the Hugging Face tutorial here:
When I run this program:
from transformers import pipeline
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-de")
translation = translator("hello, my name is Bob")
print(translation)
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-fr-en")
print(translator("Ce cours est produit par Hugging Face."))
It is happy with the German translation model but produces this error for the French one:
/Users/foo/src/tensorflowml/venv/bin/python "/Applications/PyCharm CE.app/Contents/plugins/python-ce/helpers/pydev/pydevd.py" --multiprocess --qt-support=auto --client 127.0.0.1 --port 54965 --file /Users/foo/src/tensorflowml/main.py
Connected to pydev debugger (build 221.5787.24)
Metal device set to: Apple M1 Pro
systemMemory: 16.00 GB
maxCacheSize: 5.33 GB
2022-07-30 14:32:36.965083: I tensorflow/core/common_runtime/pluggable_device/pluggable_device_factory.cc:305] Could not identify NUMA node of platform GPU ID 0, defaulting to 0. Your kernel may not have been built with NUMA support.
2022-07-30 14:32:36.965366: I tensorflow/core/common_runtime/pluggable_device/pluggable_device_factory.cc:271] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 0 MB memory) -> physical PluggableDevice (device: 0, name: METAL, pci bus id: <undefined>)
All model checkpoint layers were used when initializing TFMarianMTModel.
All the layers of TFMarianMTModel were initialized from the model checkpoint at Helsinki-NLP/opus-mt-en-de.
If your task is similar to the task the model of the checkpoint was trained on, you can already use TFMarianMTModel for predictions without further training.
[{'translation_text': 'Hallo, mein Name ist Bob'}]
Traceback (most recent call last):
File "/Applications/PyCharm CE.app/Contents/plugins/python-ce/helpers/pydev/pydevd.py", line 1491, in _exec
pydev_imports.execfile(file, globals, locals) # execute the script
File "/Applications/PyCharm CE.app/Contents/plugins/python-ce/helpers/pydev/_pydev_imps/_pydev_execfile.py", line 18, in execfile
exec(compile(contents+"\n", file, 'exec'), glob, loc)
File "/Users/foo/src/tensorflowml/main.py", line 18, in <module>
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-fr-en")
File "/Users/foo/src/tensorflowml/venv/lib/python3.8/site-packages/transformers/pipelines/__init__.py", line 650, in pipeline
framework, model = infer_framework_load_model(
File "/Users/foo/src/tensorflowml/venv/lib/python3.8/site-packages/transformers/pipelines/base.py", line 266, in infer_framework_load_model
raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model Helsinki-NLP/opus-mt-fr-en with any of the following classes: (<class 'transformers.models.auto.modeling_tf_auto.TFAutoModelForSeq2SeqLM'>, <class 'transformers.models.marian.modeling_tf_marian.TFMarianMTModel'>).
python-BaseException
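From the error it looks like only the TensorFlow classes (TFAutoModelForSeq2SeqLM, TFMarianMTModel) were attempted, and my TensorFlow install is the Metal one from the log above. My guess, and it is only a guess, is that the fr-en checkpoint ships PyTorch weights but not TensorFlow ones. Would forcing the PyTorch backend with the documented `framework` argument to `pipeline` (assuming `torch` is installed) be a reasonable workaround? A minimal sketch of what I mean:

```python
from transformers import pipeline

# Force the PyTorch backend instead of letting the pipeline pick TensorFlow.
# Assumption: the fr-en checkpoint has PyTorch weights and torch is installed.
translator = pipeline(
    "translation",
    model="Helsinki-NLP/opus-mt-fr-en",
    framework="pt",
)
print(translator("Ce cours est produit par Hugging Face."))
```

I have not verified this on my machine yet; I mainly want to understand why the TensorFlow path fails when the en-de model loads fine.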
Thanks for any clues.
-jjones