AutoTrain error with DeepSpeed ZeRO-3

I am trying to fine-tune Mixtral 8x7B using Hugging Face AutoTrain.

I get the error below. Can you help me?

```
Generating train split: 0 examples [00:00, ? examples/s]
Generating train split: 44 examples [00:00, 830.23 examples/s]
ERROR | 2024-08-21 09:19:18 | autotrain.trainers.common:wrapper:120 - train has failed due to an exception: Traceback (most recent call last):
  File "/app/env/lib/python3.10/site-packages/autotrain/trainers/common.py", line 117, in wrapper
    return func(*args, **kwargs)
  File "/app/env/lib/python3.10/site-packages/autotrain/trainers/clm/main.py", line 28, in train
    train_sft(config)
  File "/app/env/lib/python3.10/site-packages/autotrain/trainers/clm/train_clm_sft.py", line 46, in train
    trainer = SFTTrainer(
  File "/app/env/lib/python3.10/site-packages/huggingface_hub/utils/_deprecation.py", line 101, in inner_f
    return f(*args, **kwargs)
  File "/app/env/lib/python3.10/site-packages/trl/trainer/sft_trainer.py", line 413, in __init__
    super().__init__(
  File "/app/env/lib/python3.10/site-packages/transformers/trainer.py", line 444, in __init__
    raise ValueError(
ValueError: Model was not initialized with Zero-3 despite being configured for DeepSpeed Zero-3. Please re-initialize your model via Model.from_pretrained(...) or Model.from_config(...) after creating your TrainingArguments!

ERROR | 2024-08-21 09:19:18 | autotrain.trainers.common:wrapper:121 - Model was not initialized with Zero-3 despite being configured for DeepSpeed Zero-3. Please re-initialize your model via Model.from_pretrained(...) or Model.from_config(...) after creating your TrainingArguments!
INFO | 2024-08-21 09:19:18 | autotrain.trainers.common:pause_space:77 - Pausing space…
```

It seems similar to huggingface/transformers issue #32901 on GitHub: "Running on Multiple GPU with DeepSpeed. Error: Model was not initialized with Zero-3 despite being configured for Deepspeed Zero-3. Please re-initialize your model via Model.from_pretrained or Model.from_config after creating your TrainingArguments!"
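For reference, the initialization order the error message asks for looks roughly like this in a plain `transformers` setup (a minimal sketch, not AutoTrain's actual internals; the model ID and DeepSpeed config path are placeholders). With ZeRO-3, `TrainingArguments` must exist before the model is loaded, so that `from_pretrained()` runs inside the ZeRO-3 context:

```python
# Minimal sketch of the model/TrainingArguments ordering that DeepSpeed
# ZeRO-3 requires. Assumes a working deepspeed + multi-GPU environment;
# "ds_zero3_config.json" is a placeholder path to a ZeRO-3 config file.
from transformers import AutoModelForCausalLM, TrainingArguments

# 1. Create TrainingArguments (pointing at the DeepSpeed config) FIRST.
#    This activates the ZeRO-3 context that from_pretrained() checks for.
training_args = TrainingArguments(
    output_dir="out",
    deepspeed="ds_zero3_config.json",
)

# 2. Only AFTER that, load the model, so its weights are initialized
#    (and sharded) under ZeRO-3 instead of as a normal dense model.
model = AutoModelForCausalLM.from_pretrained("mistralai/Mixtral-8x7B-v0.1")
```

The traceback shows AutoTrain constructing the `SFTTrainer` with a model that was loaded before (or outside) that context, which is exactly what the `ValueError` in `transformers/trainer.py` guards against.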