Hugging Face Forums
Explicitly disable bf16 for some layers
🤗Transformers
John6666
June 17, 2025, 4:47am
Or is it similar to this issue…?
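Independent of the linked issue, a common pattern for keeping only some layers out of bf16 in PyTorch is to load the whole model in bf16 and then cast selected module types back to float32. The tiny `nn.Sequential` below is a hypothetical stand-in for a real Transformers model, and the choice of `nn.LayerNorm` as the full-precision layer is just an illustrative assumption:

```python
import torch
import torch.nn as nn

# Toy model standing in for an actual Transformers model (assumption).
model = nn.Sequential(
    nn.Linear(8, 8),
    nn.LayerNorm(8),
    nn.Linear(8, 2),
).to(torch.bfloat16)  # cast everything to bf16 first

# Cast only the selected layer types (here: LayerNorm) back to float32,
# so they run in full precision while the rest of the model stays in bf16.
for module in model.modules():
    if isinstance(module, nn.LayerNorm):
        module.float()

print(model[0].weight.dtype)  # still bf16
print(model[1].weight.dtype)  # now float32
```

The same `modules()` loop works on a Hugging Face model object, since it is an ordinary `nn.Module`; you can also match modules by name via `named_modules()` if you want to exempt specific layers rather than whole layer types.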
Related topics

- Saving bf16 Model Weights When Using Accelerate+DeepSpeed (🤗Accelerate): 4 replies, 422 views, March 17, 2025
- Trainer option to disable saving DeepSpeed checkpoints (🤗Transformers): 8 replies, 6525 views, May 23, 2023
- Can I use fp16 model for mixed precision training? (🤗Transformers): 0 replies, 296 views, January 16, 2024
- What is the recommended way to do inference with low precision during training? (🤗Accelerate): 1 reply, 1452 views, December 6, 2022
- Question met when using DeepSpeed ZeRO3 AMP for code testing on simple pytorch examples (🤗Accelerate): 0 replies, 32 views, July 24, 2024