Hi, I'm new and don't know much yet. I'm trying to run "Building Agents That Use Code" in Colab. When running the third cell (after signing in):
```python
from smolagents import CodeAgent, DuckDuckGoSearchTool, InferenceClientModel

agent = CodeAgent(tools=[DuckDuckGoSearchTool()], model=InferenceClientModel())
agent.run("Search for the best music recommendations for a party at the Wayne's mansion.")
```
I get the following error:
```
Error in generating model output:
Provider 'featherless-ai' not supported. Available values: 'auto' or any provider from ['black-forest-labs',
'cerebras', 'cohere', 'fal-ai', 'fireworks-ai', 'hf-inference', 'hyperbolic', 'nebius', 'novita', 'nscale',
'openai', 'replicate', 'sambanova', 'together']. Passing 'auto' (default value) will automatically select the first
provider available for the model, sorted by the user's order in https://hf.co/settings/inference-providers.
```
I think you probably forgot to run `pip install` in the first cell, which leaves the `huggingface_hub` library outdated; there was also a specification change in smolagents. It's probably a library version issue. Try this instead:
```python
from smolagents import CodeAgent, WebSearchTool, InferenceClientModel

agent = CodeAgent(tools=[WebSearchTool()], model=InferenceClientModel())
agent.run("Search for the best music recommendations for a party at the Wayne's mansion.")
```
opened 09:16PM - 12 Jun 25 UTC
hands-on-bug
**Describe the bug**
When running the code in [Building Agents That Use Code](ht… tps://huggingface.co/learn/agents-course/unit2/smolagents/code_agents), the DuckDuckGo search tool is no longer included in the default smolagents package.
**To Reproduce**
Run the code in the [notebook](https://huggingface.co/agents-course/notebooks/blob/main/unit2/smolagents/code_agents.ipynb). The failure happens in the third code block.
**Additional context**
It appears that the DuckDuckGoSearchTool was removed as a required dependency in smolagents [v1.15.0](https://github.com/huggingface/smolagents/releases/tag/v1.15.0), and was replaced in the documentation with WebSearchTool.
Remember to update the course pages and not just the notebook.
opened 09:58AM - 09 May 25 UTC
closed 03:45PM - 12 May 25 UTC
bug
### Describe the bug
It looks like `GET /api/models/...?expand=inferenceProviderMapping` is returning a still-unsupported inference provider.
https://github.com/huggingface/huggingface_hub/blob/785835fc58defeff40368663589cca0a9a8808da/src/huggingface_hub/inference/_providers/__init__.py#L52-L131
Would it be more convenient to skip unsupported providers instead of raising an exception?
### Reproduction
#### `requirements.txt`
```console
huggingface_hub==0.31.1
```
#### `example.py`
```python
from huggingface_hub import InferenceClient
InferenceClient("meta-llama/Llama-3.1-70B-Instruct").chat_completion(messages=[])
```
### Logs
#### `python example.py`
```python
Traceback (most recent call last):
  File "/.../example.py", line 13, in <module>
    InferenceClient("meta-llama/Llama-3.1-70B-Instruct").chat_completion(messages=[])
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^
  File "/.../.venv/lib/python3.13/site-packages/huggingface_hub/inference/_client.py", line 886, in chat_completion
    provider_helper = get_provider_helper(
        self.provider,
        ...<3 lines>...
        else payload_model,
    )
  File "/.../.venv/lib/python3.13/site-packages/huggingface_hub/inference/_providers/__init__.py", line 169, in get_provider_helper
    raise ValueError(
        ...<3 lines>...
    )
ValueError: Provider 'featherless-ai' not supported. Available values: 'auto' or any provider from ['black-forest-labs', 'cerebras', 'cohere', 'fal-ai', 'fireworks-ai', 'hf-inference', 'hyperbolic', 'nebius', 'novita', 'openai', 'replicate', 'sambanova', 'together'].Passing 'auto' (default value) will automatically select the first provider available for the model, sorted by the user's order in https://hf.co/settings/inference-providers.
```
#### `GET https://huggingface.co/api/models/meta-llama/Llama-3.1-70B-Instruct?expand=inferenceProviderMapping`
```json
{
  "_id": "66969ad27a033bf62173f3e2",
  "id": "meta-llama/Llama-3.1-70B-Instruct",
  "inferenceProviderMapping": {
    "featherless-ai": {
      "status": "live",
      "providerId": "meta-llama/Meta-Llama-3.1-70B-Instruct",
      "task": "conversational"
    },
    "novita": {
      "status": "live",
      "providerId": "meta-llama/llama-3.1-70b-instruct",
      "task": "conversational"
    },
    "nebius": {
      "status": "live",
      "providerId": "meta-llama/Meta-Llama-3.1-70B-Instruct-fast",
      "task": "conversational"
    },
    "hyperbolic": {
      "status": "staging",
      "providerId": "meta-llama/Meta-Llama-3.1-70B-Instruct",
      "task": "conversational"
    }
  }
}
```
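The ordering in that response is what triggers the error: Python preserves JSON object order, so the client walks the mapping starting with `featherless-ai`, a provider that huggingface_hub 0.31.x does not recognize. A small stdlib-only sketch (the payload string is abridged from the response above):

```python
import json

# Abridged copy of the inferenceProviderMapping payload shown above;
# json.loads preserves the object key order.
payload = json.loads("""
{
  "inferenceProviderMapping": {
    "featherless-ai": {"status": "live", "task": "conversational"},
    "novita": {"status": "live", "task": "conversational"},
    "nebius": {"status": "live", "task": "conversational"},
    "hyperbolic": {"status": "staging", "task": "conversational"}
  }
}
""")

# huggingface_hub 0.31.x encounters the first entry, a provider that
# version has never heard of -- hence the ValueError in the traceback.
first = next(iter(payload["inferenceProviderMapping"]))
print(first)  # featherless-ai
```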
### System info
```console
- huggingface_hub version: 0.31.0
- Platform: macOS-15.4.1-arm64-arm-64bit-Mach-O
- Python version: 3.13.3
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Running in Google Colab Enterprise ?: No
- Token path ?: /.../.cache/huggingface/token
- Has saved token ?: False
- Configured git credential helpers: osxkeychain
- FastAI: N/A
- Tensorflow: N/A
- Torch: 2.7.0
- Jinja2: 3.1.6
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: 11.2.1
- hf_transfer: 0.1.9
- gradio: N/A
- tensorboard: N/A
- numpy: 2.2.5
- pydantic: 2.10.6
- aiohttp: 3.11.18
- hf_xet: 1.1.0
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: /.../.cache/huggingface/hub
- HF_ASSETS_CACHE: /.../.cache/huggingface/assets
- HF_TOKEN_PATH: /.../.cache/huggingface/token
- HF_STORED_TOKENS_PATH: /.../.cache/huggingface/stored_tokens
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
```
Hmm… it seems `featherless-ai` has only just become supported.
```
!pip install -U huggingface_hub[hf_xet] smolagents
```
A similar error was fixed by someone on the HF team: "Very sorry about the inconvenience, we fixed this issue server-side, it should be all good now! I'm closing this issue (and the PR) but let us know if you see any other unexpected behavior."
But no instructions are provided (or none that I can follow).
Now it works, after adding the line `!pip install duckduckgo-search`. The error changed once I ran this: now I get the error I've seen in other posts, about payment being required.
Thanks for your help.