Hi, I tried to load the INT4 version of meta-llama/Meta-Llama-3.1-70B, but I am getting a fatal error. My server has 32 GB of RAM. The model itself loads successfully, but during prediction I get a fatal error. Can somebody help me with this?
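For context, here is a minimal sketch of how such a load is commonly done with transformers plus bitsandbytes 4-bit quantization. My post doesn't include the actual code, so the quantization config, `device_map`, and offload folder below are assumptions about the setup, not an exact reproduction of what I ran:

```python
# Sketch only: assumes the transformers + bitsandbytes 4-bit path and an
# available GPU; the offload folder name and generation settings are made up
# for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Meta-Llama-3.1-70B"  # model from the post

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # INT4 weights
    bnb_4bit_compute_dtype=torch.float16,   # compute in fp16
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",          # spread weights across GPU / CPU / disk
    offload_folder="offload",   # spill layers to disk when memory runs out
    low_cpu_mem_usage=True,
)

# The crash happens at this step (prediction), not at load time.
inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```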