Hugging Face Forums
How does BitsAndBytesConfig use bnb_4bit_compute_dtype=torch.bfloat16 on a GPU that doesn't support torch.bfloat16?
Beginners
mohamedemam
October 20, 2023, 11:05pm
Does it convert to float16 when it finds that the GPU doesn't support bfloat16?