Could not load model Helsinki-NLP/opus-mt-fr-en


I am using an Apple M1 laptop and trying to follow along with the Hugging Face tutorial here:

When I run this program:

from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-de")
translation = translator("hello, my name is Bob")

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-fr-en")
translator("Ce cours est produit par Hugging Face.")

It is happy with the German translation model, but produces this error for the French one:

/Users/foo/src/tensorflowml/venv/bin/python "/Applications/PyCharm" --multiprocess --qt-support=auto --client --port 54965 --file /Users/foo/src/tensorflowml/
Connected to pydev debugger (build 221.5787.24)
Metal device set to: Apple M1 Pro
systemMemory: 16.00 GB
maxCacheSize: 5.33 GB
2022-07-30 14:32:36.965083: I tensorflow/core/common_runtime/pluggable_device/] Could not identify NUMA node of platform GPU ID 0, defaulting to 0. Your kernel may not have been built with NUMA support.
2022-07-30 14:32:36.965366: I tensorflow/core/common_runtime/pluggable_device/] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 0 MB memory) -> physical PluggableDevice (device: 0, name: METAL, pci bus id: <undefined>)
All model checkpoint layers were used when initializing TFMarianMTModel.
All the layers of TFMarianMTModel were initialized from the model checkpoint at Helsinki-NLP/opus-mt-en-de.
If your task is similar to the task the model of the checkpoint was trained on, you can already use TFMarianMTModel for predictions without further training.
[{'translation_text': 'Hallo, mein Name ist Bob'}]
Traceback (most recent call last):
  File "/Applications/PyCharm", line 1491, in _exec
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "/Applications/PyCharm", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "/Users/foo/src/tensorflowml/", line 18, in <module>
    translator = pipeline("translation", model="Helsinki-NLP/opus-mt-fr-en")
  File "/Users/foo/src/tensorflowml/venv/lib/python3.8/site-packages/transformers/pipelines/", line 650, in pipeline
    framework, model = infer_framework_load_model(
  File "/Users/foo/src/tensorflowml/venv/lib/python3.8/site-packages/transformers/pipelines/", line 266, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model Helsinki-NLP/opus-mt-fr-en with any of the following classes: (<class ''>, <class 'transformers.models.marian.modeling_tf_marian.TFMarianMTModel'>).

Thanks for any clues.
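For completeness, here is what I was planning to try next: bypassing the pipeline helper and loading the model directly, passing from_pt=True to ask transformers to convert the PyTorch weights in case the fr-en checkpoint is missing TensorFlow weights (that is just my guess at the cause, not something I have confirmed):

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_name = "Helsinki-NLP/opus-mt-fr-en"

tokenizer = AutoTokenizer.from_pretrained(model_name)
# from_pt=True tells transformers to load and convert the PyTorch
# checkpoint when no TensorFlow weights are available (my assumption
# about why the pipeline fails; requires torch to be installed)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_name, from_pt=True)

inputs = tokenizer("Ce cours est produit par Hugging Face.", return_tensors="tf")
outputs = model.generate(**inputs)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

I don't know whether that works around the ValueError or just moves it, though.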