Hey @MoritzLaurer, the way I usually do this is by specifying the `label2id` and `id2label` dictionaries in the model's config class, e.g. for a text classifier you can do this:
```python
from transformers import AutoConfig, AutoModelForSequenceClassification

# define the mappings as dictionaries
label2id = ...
id2label = ...

# define model checkpoint - can be the same model that you already have on the hub
model_ckpt = ...

# define config
config = AutoConfig.from_pretrained(model_ckpt, label2id=label2id, id2label=id2label)

# load model with config
model = AutoModelForSequenceClassification.from_pretrained(model_ckpt, config=config)

# export model
model.save_pretrained(target_name_or_path)
```
Then you can push the files to the hub as usual. HTH!
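In case it helps, here's a minimal sketch of how the two mappings can be built from a list of class names; the `labels` list below is purely hypothetical, swap in your own classes:

```python
# hypothetical label set for a three-class sentiment classifier
labels = ["negative", "neutral", "positive"]

# map each label name to an integer id, and the reverse
label2id = {label: idx for idx, label in enumerate(labels)}
id2label = {idx: label for idx, label in enumerate(labels)}

print(label2id)  # {'negative': 0, 'neutral': 1, 'positive': 2}
print(id2label)  # {0: 'negative', 1: 'neutral', 2: 'positive'}
```

The ids should match the integer targets you trained with, otherwise the displayed label names won't line up with the model's predictions.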