Hello everyone, I’ve deployed an HF LLM, and I’ve set the Max Input Length (per Query) to 4,000. But my issue is that when I enter a long text that’s nowhere near 4000 tokens long, I get the error “NoneType is not subscriptable.” Any help would be appreciated. Thx.
Hi @ThatOneCoder
Glad to see you in the forums as well. Could you share your code with us?
Generally, a 'NoneType' object is not subscriptable
is a Python error raised when you try to index into (e.g., access a key of) a variable that is None.
Here's a minimal reproducer:
a = None
a["hi"]  # TypeError: 'NoneType' object is not subscriptable
please verify your data pipeline, or post below any code that might help us out to identify the issue
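If the None can legitimately sneak in somewhere in your pipeline (e.g., a session value that was never set), a defensive lookup avoids the crash. A minimal sketch, nothing Chainlit-specific — the helper name is just illustrative:

```python
def safe_index(obj, key, default=None):
    """Return obj[key], or `default` when obj is None."""
    if obj is None:
        return default
    return obj[key]

print(safe_index(None, "hi"))         # falls back to the default instead of crashing
print(safe_index({"hi": 1}, "hi"))    # normal lookup still works
```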
Yooo wsg. It’s in this Chainlit code, in this @cl.on_message part:
@cl.on_message
async def main(message: cl.Message):
    search = DuckDuckGoSearchResults()
    search_results = search.invoke(message.content)
    message_history = cl.user_session.get("message_history")
    message_history.append({"role": "user", "content": message.content})
    message_history.append({"role": "system", "content": search_results})
    msg = cl.Message(content="")
    await msg.send()
    stream = client.chat.completions.create(
        model=ENDPOINT_URL,
        messages=message_history,
        temperature=0.50,
        max_tokens=5000,
        stream=True,
    )
    for part in stream:
        if token := part.choices[0].delta.content or "":
            await msg.stream_token(token)
    message_history.append({"role": "assistant", "content": msg.content})
    await msg.update()
Hi @ThatOneCoder
I have no clue about Chainlit, but I would assume the problem lies in this line — I think something goes wrong when your code reaches this point:
if token := part.choices[0].delta.content or "":
I have seen problems when streaming results before, and would recommend switching from streaming to a single API call that awaits the full result.
To further check whether this is the case for you too, try the following code to print each part, and let me know your findings:
(...)
for part in stream:
    print("part =", part)
    if token := part.choices[0].delta.content or "":
(...)
To fix this, I would say you use this code instead:
result = ""
try:
    for message in client.chat_completion(
        messages=[{"role": "user", "content": prompt}],
        max_tokens=max_length,
        stream=True,
    ):
        # delta.content can be None on some chunks, so fall back to ""
        result += message.choices[0].delta.content or ""
except Exception:
    pass
print(result)
Thx, I was able to fix the problem. =) Now there’s a new, similar issue: if I send a long message, the assistant ends up crashing. How can I fix that? I think it may be tied to the DDGS part of the code.
@ThatOneCoder I think the only way to fix this is to increase max_tokens
to something like len(prompt) + 1024
or any other number of your choice.
Also, do share your traceback with me; it helps identify exactly where the issue occurred.
Didn’t work… Here is the full error:
Traceback (most recent call last):
File "/opt/render/project/src/.venv/lib/python3.11/site-packages/chainlit/utils.py", line 44, in wrapper
return await user_function(**params_values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/render/project/src/.venv/lib/python3.11/site-packages/chainlit/callbacks.py", line 118, in with_parent_id
await func(message)
File "/opt/render/project/src/Babbage-Plus/Babbage-V1-Plus.py", line 143, in main
search_results = search.invoke(message.content)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/render/project/src/.venv/lib/python3.11/site-packages/langchain_core/tools/base.py", line 481, in invoke
return self.run(tool_input, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/render/project/src/.venv/lib/python3.11/site-packages/langchain_core/tools/base.py", line 684, in run
raise error_to_raise
File "/opt/render/project/src/.venv/lib/python3.11/site-packages/langchain_core/tools/base.py", line 653, in run
response = context.run(self._run, *tool_args, **tool_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/render/project/src/.venv/lib/python3.11/site-packages/langchain_community/tools/ddg_search/tool.py", line 103, in _run
res = self.api_wrapper.results(query, self.max_results, source=self.backend)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/render/project/src/.venv/lib/python3.11/site-packages/langchain_community/utilities/duckduckgo_search.py", line 127, in results
for r in self._ddgs_text(query, max_results=max_results)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/render/project/src/.venv/lib/python3.11/site-packages/langchain_community/utilities/duckduckgo_search.py", line 64, in _ddgs_text
ddgs_gen = ddgs.text(
^^^^^^^^^^
File "/opt/render/project/src/.venv/lib/python3.11/site-packages/duckduckgo_search/duckduckgo_search.py", line 240, in text
results = self._text_api(keywords, region, safesearch, timelimit, max_results)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/render/project/src/.venv/lib/python3.11/site-packages/duckduckgo_search/duckduckgo_search.py", line 274, in _text_api
vqd = self._get_vqd(keywords)
^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/render/project/src/.venv/lib/python3.11/site-packages/duckduckgo_search/duckduckgo_search.py", line 135, in _get_vqd
resp_content = self._get_url("POST", "https://duckduckgo.com", data={"q": keywords})
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/render/project/src/.venv/lib/python3.11/site-packages/duckduckgo_search/duckduckgo_search.py", line 131, in _get_url
raise DuckDuckGoSearchException(f"{resp.url} return None. {params=} {content=} {data=}")
duckduckgo_search.exceptions.DuckDuckGoSearchException: https://duckduckgo.com/ return None. params=None content=None data={'q': "\u200eTitle: The Fascinating World of Penguins: Adaptations, Habitat, and Conservation Efforts\n\nIntroduction:\nPenguins are unique and captivating creatures that have captured the hearts of people around the world. These flightless birds, primarily found in the Southern Hemisphere, have adapted to life in the water and on land, making them well-suited to their icy habitats. In this essay, we will explore the various aspects of penguin biology, their unique adaptations, their habitat, and the conservation efforts needed to protect these fascinating creatures.\n\nPhysical Adaptations:\nPenguins have evolved several physical adaptations to help them survive in their harsh environments. Their wings have transformed into flippers, which enable them to swim efficiently through the water. Their bodies are stocky and streamlined, reducing drag as they swim. Additionally, their feathers are waterproof and tightly packed, providing insulation against the cold and a sleek exterior for swimming. Some species, like the Emperor penguin, even have a layer of blubber for further insulation.\n\nHabitat and Distribution:\nPenguins can be found primarily in the Southern Hemisphere, with the majority of species residing in Antarctica and the surrounding islands. They inhabit a variety of environments, from rocky shores and beaches to ice-covered seas. Each penguin species has its preferred habitat, with some, like the Galapagos penguin, living in tropical climates and others, like the Emperor penguin, thriving in the coldest regions of Antarctica.\n\nFeeding and Social Behavior:\nPenguins are carnivorous, primarily feeding on fish, squid, and krill. Their unique adaptations allow them to be excellent swimmers and divers, sometimes reaching depths of over 1,000 feet in search of food. 
Penguins are also highly social creatures, living in large colonies called rookeries. They exhibit complex social behaviors, including courtship rituals, cooperative breeding, and hierarchical structures within their colonies.\n\nConservation Challenges:\nDespite their adaptability, penguins face numerous conservation challenges. Climate change is a significant threat, as rising temperatures and melting ice affect their breeding grounds and food sources. Overfishing and pollution also impact the availability of their prey, while human encroachment on their habitats poses additional risks. Many penguin species are currently listed as vulnerable or endangered, making conservation efforts crucial for their survival.\n\nConclusion:\nPenguins are a testament to the wonders of evolution and adaptation. Their unique physical attributes and social behaviors make them a fascinating subject of study, and their captivating appearance continues to draw people to their icy habitats. However, the challenges they face due to climate change, overfishing, and human interference are real and pressing. It is our responsibility to protect these incredible creatures and their ecosystems, ensuring that future generations can continue to marvel at the world of penguins. By supporting conservation efforts and raising awareness about the importance of preserving their habitats, we can help save these unique birds and maintain the balance of our planet's delicate ecosystems."}
I used len(message.content) + 4500
instead of prompt, because you prolly meant that.
Are you crashing because you’re passing strings that are too long to this library/function? It looks like you’re passing the raw message string itself.
duckduckgo_search.exceptions.DuckDuckGoSearchException:
Wdym by string? Srry my braincells js aren’t there rn
Strings, in programming, are generally a sequence of characters. It can also refer to a string quartet…
Anyway, the program appears to die at this point.
It probably never even reaches the generative-AI-related part of the process…
search = DuckDuckGoSearchResults()
search_results = search.invoke(message.content) # maybe message.content too long
Oh, yea. The message.content
was the message that I sent in the UI and is what caused it to crash.
https://python.langchain.com/api_reference/community/tools/langchain_community.tools.ddg_search.tool.DuckDuckGoSearchResults.html#langchain_community.tools.ddg_search.tool.DuckDuckGoSearchResults.invoke
Maybe this function only lets DuckDuckGo do the search, so it can only pass strings that are long enough to let the search engine do the search.
If it’s longer than that, I think you have to split it up or cut it down somehow.
And since search engines accept a different number of characters depending on the content, not just the length of the string, I don’t think anyone knows what the clear maximum value is…?
No, I don’t think so. The system crashed because the string was too long, so modifying the code so that it can only take strings that are long wouldn’t work. Perhaps something that breaks the string up into smaller pieces instead.
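Breaking the string up into smaller pieces could look like this — a plain-Python sketch, not tied to any particular search library:

```python
def chunk_text(text, size=400):
    """Split text into consecutive pieces of at most `size` characters."""
    if size <= 0:
        raise ValueError("size must be positive")
    return [text[i:i + size] for i in range(0, len(text), size)]

# each chunk could then be searched (or summarized) separately
chunks = chunk_text("a" * 1000, size=400)
print([len(c) for c in chunks])  # [400, 400, 200]
```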
Sorry, my translator was translating weird. That’s what I meant.
Ah, it’s all gud. Do you have an idea on how this could be implemented tho? Like some code or something?
I think simply cutting the string down, as in the code below, would be the most concise fix.
However, it is also possible to summarize a long string of text with an AI model first and then pass that on. It’s a bit more difficult, though.
Sorry, I’m off to eat.
max_query = 50  # maximum number of characters passed to the search
search_results = search.invoke(message.content[:max_query])
Thx, I’ll check it out. Have a good meal G
I’m back. I can answer simple questions about programming in general.
As for LLM, you’re better off asking not-lain, nielsr, or someone else.
@ThatOneCoder
I wrote a blog post about building a retrieval system using Hugging Face datasets and text generation here.
Give it a spin later and change the AI model implementation with your API one.
Thx. Btw who’s Rayner V. Giuret?