Optimum ONNX export failure

I would appreciate some help understanding why this export fails and what I can do about it.
Thank you!!

▶ optimum-cli export onnx --model codellama/CodeLlama-7b-Instruct-hf codellama-onnx
Framework not specified. Using pt to export to ONNX.
Downloading shards:   0%|                                                                                                                | 0/2 [16:38<?, ?it/s]
Loading TensorFlow model in PyTorch before exporting.                                                                   | 1.40G/9.98G [16:38<1:42:53, 1.39MB/s]
Traceback (most recent call last):
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/optimum/exporters/tasks.py", line 1708, in get_model_from_task
    model = model_class.from_pretrained(model_name_or_path, **kwargs)
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 563, in from_pretrained
    return model_class.from_pretrained(
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2869, in from_pretrained
    resolved_archive_file, sharded_metadata = get_checkpoint_shard_files(
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/transformers/utils/hub.py", line 1040, in get_checkpoint_shard_files
    cached_filename = cached_file(
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/transformers/utils/hub.py", line 429, in cached_file
    resolved_file = hf_hub_download(
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1431, in hf_hub_download
    http_get(
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 557, in http_get
    raise EnvironmentError(
OSError: Consistency check failed: file should be of size 9976701592 but has size 1402851118 ((…)of-00002.safetensors).
We are sorry for the inconvenience. Please retry download and pass `force_download=True, resume_download=False` as argument.
If the issue persists, please let us know by opening an issue on https://github.com/huggingface/huggingface_hub.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/stephane/.pyenv/versions/3.10.7/bin/optimum-cli", line 8, in <module>
    sys.exit(main())
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/optimum/commands/optimum_cli.py", line 163, in main
    service.run()
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/optimum/commands/export/onnx.py", line 232, in run
    main_export(
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/optimum/exporters/onnx/__main__.py", line 323, in main_export
    model = TasksManager.get_model_from_task(
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/optimum/exporters/tasks.py", line 1717, in get_model_from_task
    model = model_class.from_pretrained(model_name_or_path, **kwargs)
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 563, in from_pretrained
    return model_class.from_pretrained(
  File "/Users/stephane/.pyenv/versions/3.10.7/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2839, in from_pretrained
    raise EnvironmentError(
OSError: codellama/CodeLlama-7b-Instruct-hf does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
Downloading (…)of-00002.safetensors:  14%|██████████▉                                                                   | 1.40G/9.98G [16:39<1:41:46, 1.40MB/s]
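If I'm reading the traceback right, the first error is a truncated shard download (1.40 GB received out of an expected 9.98 GB), and the second error is just the fallback path failing because the repo only ships safetensors weights. Based on the error message's own suggestion, here is what I plan to try next: force a clean re-download of the model before rerunning the export. This is only a sketch; `redownload` is my own helper name, and it assumes `huggingface_hub`'s `snapshot_download` accepts the `force_download` and `resume_download` arguments mentioned in the error.

```python
# Sketch: force a clean re-download of the corrupted shard, as the
# error message suggests, before retrying the ONNX export.
from huggingface_hub import snapshot_download


def redownload(repo_id: str) -> str:
    # force_download=True discards the truncated partial shard instead of
    # resuming from it; returns the local snapshot directory on success.
    return snapshot_download(
        repo_id,
        force_download=True,
        resume_download=False,
    )


if __name__ == "__main__":
    local_dir = redownload("codellama/CodeLlama-7b-Instruct-hf")
    print(local_dir)
```

Once the download completes cleanly, I would rerun the same `optimum-cli export onnx` command, which should then pick the shards up from the local cache. Does that sound like the right approach, or is something else going on here?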