Hi, I tried to load the INT4 version of meta-llama/Meta-Llama-3.1-70B, but I am getting a fatal error. My server has 32 GB of RAM. The model loads successfully, but during prediction I get a fatal error. Can somebody help me with this?
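For reference, since the loading code isn't included in the question, here is a minimal sketch of how a 4-bit (INT4) load is often done with transformers and bitsandbytes. The stack, the quantization settings, and the prompt are all assumptions; only the model ID comes from the question.

```python
# Hypothetical sketch of a 4-bit load; the actual code used was not shared.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Meta-Llama-3.1-70B"  # model named in the question

# 4-bit (NF4) quantization via bitsandbytes -- an assumed configuration
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",        # lets accelerate place/offload layers automatically
    low_cpu_mem_usage=True,
)

# Small generation call, roughly where "prediction" would fail
inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

If your loading code differs from this sketch, sharing it (plus the exact fatal error message) would make it much easier to diagnose.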