I'm using autotrain to fine-tune `BramVanroy/GEITje-7B-ULTRA`, an already fine-tuned question-answering model that is based on Mistral 7B and further pretrained on Dutch data.
I have some context-specific question-answer pairs that I'd like to fine-tune the model on further. However, when I run a simple autotrain space with the following settings (see picture), I get this error:
```
autotrain.trainers.common:wrapper:121 - Unrecognized configuration class <class 'transformers.models.mistral.configuration_mistral.MistralConfig'> for this kind of AutoModel: AutoModelForSeq2SeqLM.
Model type should be one of BartConfig, BigBirdPegasusConfig, BlenderbotConfig, BlenderbotSmallConfig, EncoderDecoderConfig, FSMTConfig, GPTSanJapaneseConfig, LEDConfig, LongT5Config, M2M100Config, MarianConfig, MBartConfig, MT5Config, MvpConfig, NllbMoeConfig, PegasusConfig, PegasusXConfig, PLBartConfig, ProphetNetConfig, SeamlessM4TConfig, SeamlessM4Tv2Config, SwitchTransformersConfig, T5Config, UMT5Config, XLMProphetNetConfig…
```
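For what it's worth, I think the same error can be reproduced outside of autotrain with plain transformers (a minimal sketch; only the config is fetched, no weights):

```python
from transformers import AutoConfig, AutoModelForSeq2SeqLM

config = AutoConfig.from_pretrained("BramVanroy/GEITje-7B-ULTRA")
print(type(config).__name__)  # MistralConfig

# Mistral is a decoder-only (causal) architecture, so the seq2seq auto class
# rejects its config with the same ValueError that autotrain logs:
try:
    AutoModelForSeq2SeqLM.from_config(config)
except ValueError as err:
    print(err)  # "Unrecognized configuration class ... MistralConfig ..."
```

So it looks like the model itself is fine for causal-LM loading, and something in my setup is routing it to AutoModelForSeq2SeqLM.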
What could I be doing wrong? My guess is that one of the task/trainer settings in the space makes autotrain load the model as a seq2seq model rather than a causal LM, but I'm not sure which setting controls that.