I’m trying to export `jais` using `optimum-cli`, but I’m running into the following error:
```
ValueError: Trying to export a jais model, that is a custom or unsupported architecture for the task text-generation, but no custom onnx configuration was passed as `custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type jais to be supported natively in the ONNX export.
```
The documentation describes how to create a custom ONNX config, but not what should go in it. I can see examples of other model configs in `optimum/exporters/onnx/model_configs.py`, but, as a total beginner, I haven’t been able to extrapolate from those examples what I need for `jais`. Finally, I don’t see any configuration information on the `jais` homepage.
What do I need to put in the required config for `jais`? More generally, why is the config required at all?
I don’t have any ML experience (I’m an AI compiler engineer), so I’m sure this is a silly question; but no one on my team was able to figure it out either. I appreciate any help!