Hugging Face Forums
Explicitly disable bf16 for some layers
🤗Transformers
John6666 — June 17, 2025, 4:47am — #3
Or similar to this issue…?
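(Not an answer from the linked issue, just for context: a minimal sketch of one way to keep selected modules out of bf16 — here the LayerNorms — by casting them back to fp32 and using forward hooks to move activations across the dtype boundary. The model name and the choice of layers are assumptions for illustration, not something confirmed in this thread.)

```python
import torch
from transformers import AutoModelForCausalLM

# Load the whole model in bf16 ("gpt2" is only an example).
model = AutoModelForCausalLM.from_pretrained("gpt2", torch_dtype=torch.bfloat16)

def cast_inputs_to_fp32(module, args):
    # Pre-hook: fp32 weights need fp32 activations, otherwise PyTorch raises
    # "expected mat1 and mat2 to have the same dtype".
    return tuple(
        a.to(torch.float32) if torch.is_tensor(a) and a.is_floating_point() else a
        for a in args
    )

def cast_output_to_bf16(module, args, output):
    # Post-hook: hand bf16 back to the downstream bf16 layers.
    return output.to(torch.bfloat16) if torch.is_tensor(output) else output

for name, module in model.named_modules():
    if isinstance(module, torch.nn.LayerNorm):
        module.to(torch.float32)  # keep this layer in full precision
        module.register_forward_pre_hook(cast_inputs_to_fp32)
        module.register_forward_hook(cast_output_to_bf16)
```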
Related topics:
- Saving bf16 Model Weights When Using Accelerate+DeepSpeed (🤗Accelerate) — 4 replies, 360 views, March 17, 2025
- Setting Different Precisions for Different Modules During Mixed Precision Training (Beginners) — 0 replies, 126 views, May 30, 2024
- Question met when using DeepSpeed ZeRO3 AMP for code testing on simple pytorch examples (🤗Accelerate) — 0 replies, 30 views, July 24, 2024
- Impossible to train a model using both bf16 mixed precision training and torch compile, RuntimeError: expected mat1 and mat2 to have the same dtype (🤗Transformers) — 8 replies, 1771 views, October 28, 2024
- Model pre-training precision database: fp16, fp32, bf16 (🤗Transformers) — 4 replies, 7022 views, December 3, 2022