Hugging Face Forums
Can we use mixed precision with all three? (fp16 + fp32 + bf16)
🤗Transformers
Indramal
December 1, 2022, 10:36am
1
Is it possible to use mixed precision with all of these formats?
By "all" I mean using fp16 + fp32 + bf16 together.
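For context, here is a minimal sketch of the flags the question refers to, assuming the standard 🤗 Transformers `TrainingArguments` API on a CUDA-capable machine. Note that fp16/bf16 mixed precision already keeps fp32 master weights alongside the half-precision compute, and to my knowledge the two half-precision flags are mutually exclusive (the exact error message may vary by version):

```python
from transformers import TrainingArguments

# fp16 mixed precision: fp16 compute + fp32 master weights
args_fp16 = TrainingArguments(output_dir="out", fp16=True)

# bf16 mixed precision: bf16 compute + fp32 master weights
args_bf16 = TrainingArguments(output_dir="out", bf16=True)

# Requesting both at once is expected to raise a ValueError
try:
    TrainingArguments(output_dir="out", fp16=True, bf16=True)
except ValueError as err:
    print(err)
```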