Trying to convert DeepSeek-R1 to ONNX

I’m trying to convert DeepSeek-R1 into ONNX format, but I’m presented with:

ValueError: Loading deepseek-ai/DeepSeek-R1 requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option trust_remote_code=True to remove this error.

I’m trying to do this using optimum-cli:

optimum-cli export onnx --model deepseek-ai/DeepSeek-R1 --task causal-lm C:\DeepSeek-R1-Onnx

Can I somehow enable this via the CLI, or do I have to manually download the model to my system and then point the CLI at the local path instead of the repo link when performing the ONNX export?

If yes, how can I set trust_remote_code=True once I have downloaded the repo?


If you are running the files you have already downloaded and read, you have effectively trusted the code, so trust_remote_code is not the real obstacle.
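That said, recent Optimum releases expose a `--trust-remote-code` switch on the export command, so you may not need to download anything manually. This is a sketch assuming your installed version supports the flag; confirm with `optimum-cli export onnx --help` first:

```shell
# Hedged sketch: --trust-remote-code is available in recent Optimum releases.
# Verify with `optimum-cli export onnx --help` before relying on it.
optimum-cli export onnx --model deepseek-ai/DeepSeek-R1 --task causal-lm --trust-remote-code C:\DeepSeek-R1-Onnx
```

To export from a local copy instead, replace the repo id after `--model` with the path to the downloaded folder.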

However, if the problem is that you are using an old version of Transformers or Optimum, consider installing the development version from GitHub; in some cases the model type is already supported there.
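To rule out an outdated install, you can print the versions you actually have. A minimal check, assuming the packages are installed under their PyPI names `transformers` and `optimum`:

```python
# Print installed versions of transformers and optimum; newly released
# model architectures often need a recent release or the GitHub dev build.
from importlib.metadata import PackageNotFoundError, version

for pkg in ("transformers", "optimum"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "is not installed")
```

Compare the printed versions against the latest releases before assuming the model type is unsupported.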

It just means that this model type isn’t supported by the official Transformers library yet. You can look through the Python files in the repo if you really want to make sure it is safe.