Hugging Face Forums
Can we use mixed precision with all three? (fp16 + fp32 + bf16)
🤗Transformers
Indramal
December 1, 2022, 10:36am
Is it possible to use mixed precision with all of them, i.e. fp16 + fp32 + bf16 together in the same training run?
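For context, in 🤗 Transformers each mixed-precision mode already pairs one low-precision compute dtype with fp32 master weights, and the two low-precision dtypes are chosen by mutually exclusive flags. A minimal sketch, assuming the standard `TrainingArguments` API (the `output_dir` value is a placeholder):

```python
from transformers import TrainingArguments

# fp16 mixed precision: forward/backward in fp16,
# optimizer keeps fp32 master weights (so fp16 + fp32 already co-exist).
args_fp16 = TrainingArguments(
    output_dir="out",
    fp16=True,
)

# bf16 mixed precision: forward/backward in bf16, fp32 master weights.
# Requires hardware with bf16 support (e.g. Ampere+ GPUs or TPUs).
args_bf16 = TrainingArguments(
    output_dir="out",
    bf16=True,
)

# Setting fp16=True and bf16=True at the same time is not a supported
# combination; TrainingArguments raises a ValueError in that case.
```

So fp16 (or bf16) mixed precision already combines a 16-bit compute dtype with fp32, but you pick one 16-bit dtype per run rather than mixing fp16 and bf16 together.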