ONNX Conversion - transformers.onnx vs convert_graph_to_onnx.py

Hi @pierreguillou ,

Just wondering if you know how to use the new one now? I'm trying to export a BERT model with this code:

from pathlib import Path

from transformers import AutoConfig, AutoModel, AutoTokenizer
from transformers.models.bert import BertOnnxConfig
from transformers.onnx.convert import export

model_name = "sentence-transformers/all-MiniLM-L6-v2"
path = Path("/Volumes/workplace/upload_content/onnx/all-MiniLM-L6-v2.onnx")

# Build the export config from the checkpoint's own config
# rather than a default BertConfig()
config = AutoConfig.from_pretrained(model_name)
onnx_config = BertOnnxConfig(config)

tokenizer = AutoTokenizer.from_pretrained(model_name)
# export() expects a loaded model instance, not a checkpoint name
model = AutoModel.from_pretrained(model_name)

export(
    preprocessor=tokenizer,
    model=model,
    config=onnx_config,
    opset=11,
    output=path,
)

But it doesn't produce any output, and I'm not sure how to make it work. I wonder why the newer version is so much more complicated than the older one…
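In case it helps with debugging, one thing you can do offline (no model download) is inspect what the `BertOnnxConfig` declares as graph inputs and outputs, since that is the signature the exporter will trace. A minimal sketch, assuming a transformers version that still ships the `transformers.onnx` package:

```python
from transformers.models.bert import BertConfig, BertOnnxConfig

# Build the ONNX export config from a default BertConfig,
# just to inspect the declared graph signature
onnx_config = BertOnnxConfig(BertConfig())

# Input/output names (with their dynamic axes) that export() will use
print(dict(onnx_config.inputs))   # e.g. input_ids, attention_mask, token_type_ids
print(dict(onnx_config.outputs))  # e.g. last_hidden_state for the "default" feature
print(onnx_config.default_onnx_opset)
```

If the export silently does nothing, comparing these names against what your downstream runtime expects is a quick sanity check.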