Having a problem running qwen2:7b locally — getting 'AgentGenerationError: Error in generating model output'

I'm trying to run the code from the chapter Building Agents That Use Code - Hugging Face Agents Course. After exceeding the inference limit, I installed Ollama on my machine, pulled the qwen2:7b model, and checked that Ollama is running on the default URL successfully.

Here’s my code:

!pip install smolagents -U
!pip install duckduckgo-search
!pip install 'smolagents[litellm]'
from smolagents import CodeAgent, DuckDuckGoSearchTool, LiteLLMModel

model = LiteLLMModel(
    model_id="ollama_chat/qwen2:7b",
    api_base="http://127.0.0.1:11434",
    num_ctx=8192,
)
agent = CodeAgent(tools=[DuckDuckGoSearchTool()], model=LiteLLMModel())
agent.run("Search for the best music recommendations for a party at the Wayne's mansion.")

Error logs:

Error in generating model output:
litellm.AuthenticationError: Missing Anthropic API Key - A call is being made to anthropic but no key is set either
in the environment variables or via params. Please set ANTHROPIC_API_KEY in your environment vars

[Step 1: Duration 0.02 seconds]

---------------------------------------------------------------------------

AuthenticationError                       Traceback (most recent call last)

/usr/local/lib/python3.11/dist-packages/smolagents/agents.py in _step_stream(self, memory_step)
   1610             else:
-> 1611                 chat_message: ChatMessage = self.model.generate(
   1612                     input_messages,

12 frames

AuthenticationError: litellm.AuthenticationError: Missing Anthropic API Key - A call is being made to anthropic but no key is set either in the environment variables or via params. Please set ANTHROPIC_API_KEY in your environment vars

Hmm… maybe you need the ollama_chat/ prefix.

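The snippet above already uses the ollama_chat/ prefix on `model_id`, though. The more likely culprit is the agent construction line: it passes a fresh `LiteLLMModel()` with no arguments instead of the `model` configured just above it, so smolagents falls back to its default model (an Anthropic one, which is consistent with the "Missing Anthropic API Key" error). A minimal sketch of the fix, assuming Ollama is serving qwen2:7b on the default port:

```python
from smolagents import CodeAgent, DuckDuckGoSearchTool, LiteLLMModel

# Point LiteLLM at the locally running Ollama server.
model = LiteLLMModel(
    model_id="ollama_chat/qwen2:7b",
    api_base="http://127.0.0.1:11434",
    num_ctx=8192,
)

# Pass the configured model object, not a new LiteLLMModel().
agent = CodeAgent(tools=[DuckDuckGoSearchTool()], model=model)
agent.run("Search for the best music recommendations for a party at the Wayne's mansion.")
```

With `model=model`, the request should go to the local Ollama endpoint and no Anthropic key is needed.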