Trainer with FP8 - what to use in the Accelerate CLI vs. TrainingArguments

Hello there! I was trying to do some training using the Trainer with fp8. Unlike bf16, which has an argument in TrainingArguments, I don't see an option for fp8. What is the best way to pass this to the Trainer so that it actually uses fp8? Thank you so much for all your help!

You can use the Accelerate launcher directly with the Trainer to get FP8 support out of the box.

Just do:

```
accelerate launch --mixed_precision fp8 training_script_using_trainer.py --kwarg1 value …
```
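For context, here is a minimal sketch of what `training_script_using_trainer.py` could look like. The model, dataset, and hyperparameters below are placeholders chosen for illustration, not anything from the original question; the point is that the script itself contains no fp8 flag at all:

```python
# training_script_using_trainer.py -- minimal sketch; the model, dataset, and
# hyperparameters are placeholders, not anything FP8-specific.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "bert-base-uncased"  # placeholder model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tiny dataset slice, just to keep the example self-contained and quick.
dataset = load_dataset("imdb", split="train[:1%]")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

dataset = dataset.map(tokenize, batched=True)

# Note: no fp8/bf16 argument here. The --mixed_precision fp8 flag is consumed
# by the launcher and picked up by the Accelerator that Trainer builds internally.
args = TrainingArguments(
    output_dir="out",
    per_device_train_batch_size=8,
    num_train_epochs=1,
)

trainer = Trainer(model=model, args=args, train_dataset=dataset, tokenizer=tokenizer)
trainer.train()
```

Since the precision is decided at launch time, the same script runs in fp8, bf16, or fp32 depending on the flag. If you prefer not to pass the flag on every run, you can also run `accelerate config` once and set `mixed_precision: fp8` in the generated config file. As the linked issue suggests, FP8 also needs supported hardware (e.g. H100), and Accelerate's FP8 path relies on NVIDIA's Transformer Engine being installed.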

Reference:
Support H100 training with FP8 in Trainer and DeepSpeed · Issue #25333 · huggingface/transformers: https://github.com/huggingface/transformers/issues/25333