Hi, I tried to load the INT4 version of meta-llama/Meta-Llama-3.1-70B, but I am getting a fatal error. My server has 32 GB of RAM; the model loads successfully, but during prediction I get a fatal error. Can somebody help me with this?
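For context on why this might happen, here is a rough back-of-envelope estimate (my own calculation, not from any official source) of the memory a 70B-parameter model needs at INT4 precision, ignoring KV cache, activations, and runtime overhead:

```python
# Hypothetical sketch: estimate weight memory for a 70B model at INT4.
params = 70e9                 # 70 billion parameters
bytes_per_param = 0.5         # INT4 = 4 bits = 0.5 bytes per parameter
weights_gb = params * bytes_per_param / 1024**3
print(f"Weights alone: ~{weights_gb:.1f} GB")
```

The weights alone come to roughly 32-33 GB, which already fills a 32 GB server before any inference-time buffers are allocated. That would be consistent with the model appearing to load (e.g. via memory-mapped weights) and then crashing once prediction allocates additional memory.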