Hi, I am trying to run the INT4 (4-bit quantized) version of meta-llama/Meta-Llama-3.1-70B on a server with 32 GB of RAM. The model loads successfully, but during prediction (generation) the process crashes with a fatal error. Can somebody help me with this?
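For context, here is a minimal sketch of the kind of loading and generation code involved. This is an illustration only: it assumes the transformers library with bitsandbytes 4-bit quantization, `device_map="auto"` offloading, and example `max_memory` limits; my actual script and arguments may differ.

```python
# Minimal sketch (assumptions: bitsandbytes 4-bit quantization, auto device map,
# example memory caps). Not the exact script that produces the fatal error.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Meta-Llama-3.1-70B"

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                       # INT4 weights
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",                       # spread layers across GPU/CPU as they fit
    max_memory={0: "24GiB", "cpu": "30GiB"}, # example caps; adjust to the actual hardware
)

# Loading completes, but the crash happens at this step (generation/prediction):
inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```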