Hugging Face Forums
Can we use mixed precision with all? (fp16 + fp32 + bf16)
🤗Transformers
Indramal
December 1, 2022, 10:36am
1
Is it possible to use mixed precision with all of them?
By "all" I mean using fp16 + fp32 + bf16 together.
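For context, here is a minimal sketch (not part of the original question) of how mixed precision is usually selected with the 🤗 Transformers `TrainingArguments`: `fp16` mixes fp16 compute with fp32 master weights, and `bf16` does the same with bfloat16, so each flag already combines a half-precision format with fp32. The output directories below are hypothetical.

```python
from transformers import TrainingArguments

# fp16 + fp32 mixed precision (AMP on CUDA GPUs)
args_fp16 = TrainingArguments(
    output_dir="out-fp16",  # hypothetical output directory
    fp16=True,
)

# bf16 + fp32 mixed precision (recent GPUs or TPUs)
args_bf16 = TrainingArguments(
    output_dir="out-bf16",  # hypothetical output directory
    bf16=True,
)
```

In this sketch the two flags are set one at a time, since each already defines a half-precision/fp32 mix.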