Hi @mahmutc,
I am a student trying to learn this material. The snippet didn't work for me for some reason.
Here is the error log I received:
```text
Traceback (most recent call last):
  File "C:\Users\SS\Desktop\Camp_langchain_models\2.ChatModels\2_chatmodel_hf_api.py", line 9, in <module>
    print(llm.invoke("What is the capital of Turkey"))
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\SS\Desktop\Camp_langchain_models\venv\Lib\site-packages\langchain_core\language_models\llms.py", line 387, in invoke
    self.generate_prompt(
  File "C:\Users\SS\Desktop\Camp_langchain_models\venv\Lib\site-packages\langchain_core\language_models\llms.py", line 764, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\SS\Desktop\Camp_langchain_models\venv\Lib\site-packages\langchain_core\language_models\llms.py", line 971, in generate
    return self._generate_helper(
           ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\SS\Desktop\Camp_langchain_models\venv\Lib\site-packages\langchain_core\language_models\llms.py", line 790, in _generate_helper
    self._generate(
  File "C:\Users\SS\Desktop\Camp_langchain_models\venv\Lib\site-packages\langchain_core\language_models\llms.py", line 1545, in _generate
    self._call(prompt, stop=stop, run_manager=run_manager, **kwargs)
  File "C:\Users\SS\Desktop\Camp_langchain_models\venv\Lib\site-packages\langchain_huggingface\llms\huggingface_endpoint.py", line 312, in _call
    response_text = self.client.text_generation(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\SS\Desktop\Camp_langchain_models\venv\Lib\site-packages\huggingface_hub\inference\_client.py", line 2297, in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\SS\Desktop\Camp_langchain_models\venv\Lib\site-packages\huggingface_hub\inference\_providers\__init__.py", line 169, in get_provider_helper
    raise ValueError(
ValueError: Provider 'featherless-ai' not supported. Available values: 'auto' or any provider from
['black-forest-labs', 'cerebras', 'cohere', 'fal-ai', 'fireworks-ai', 'hf-inference', 'hyperbolic', 'nebius', 'novita', 'openai', 'replicate', 'sambanova', 'together']. Passing 'auto' (default value) will automatically select the first provider available for the model, sorted by the user's order in https://hf.co/settings/inference-providers.
```
I look forward to hearing from you.
Thanks,
SS