Hugging Face Forums
Fine-tune OPT 13B: CUDA out of memory error (720GB VRAM, batch size 1, fp16)!
anujn
June 24, 2022, 4:23pm
Worked like a charm! Thank you @sgugger, you rock!!!
Finetune LLM with DeepSpeed
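The accepted solution post itself is not included in this excerpt, but the linked thread suggests the fix involved DeepSpeed. For context, a minimal sketch of the kind of DeepSpeed ZeRO stage-3 configuration with CPU offload commonly used to fit large models like OPT 13B for fine-tuning (the exact settings used in the original solution are an assumption here):

```json
{
  "fp16": { "enabled": true },
  "zero_optimization": {
    "stage": 3,
    "offload_optimizer": { "device": "cpu" },
    "offload_param": { "device": "cpu" },
    "overlap_comm": true,
    "contiguous_gradients": true
  },
  "train_micro_batch_size_per_gpu": 1,
  "gradient_accumulation_steps": "auto"
}
```

Saved as e.g. `ds_config.json`, a config like this can be passed to the 🤗 Transformers `Trainer` via `TrainingArguments(deepspeed="ds_config.json")`, which shards optimizer states, gradients, and parameters across GPUs and offloads them to CPU memory instead of keeping everything resident in VRAM.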