Dear community,
I had some working code that recently stopped working.
I don't know what caused it; it was initially working in Google Colab.
It basically uses the Gradio Chatbot UI and passes queries to OpenAI via LangChain.
I used a global variable called chat_history to hold the conversation. It was working fine, but all of a sudden it is now a problem.
I can only ask the first question, and I always get the same error on the second:
ValueError: Unsupported chat history format: <class 'list'>. Full chat history: [['Hi', 'Hello! How can I assist you today?']]
It seems the chain doesn't like it when I pass the previous history back through the variable and query the model with the second question.
What did I do wrong?
# Initialise LangChain - Conversational Retrieval Chain
qa = ConversationalRetrievalChain.from_llm(ChatOpenAI(temperature=0), vectorstore.as_retriever())

# Front-end web app
import gradio as gr

# Define an empty chat_history list
chat_history = []

with gr.Blocks() as demo:
    chatbot = gr.Chatbot()
    msg = gr.Textbox()
    clear = gr.Button("Clear")

    def user(user_message, chat_history):
        # Get response from QA chain
        response = qa({"question": user_message, "chat_history": chat_history})
        # Append user message and response to chat history
        chat_history = [(user_message, response["answer"])]
        return gr.update(value=""), chat_history

    msg.submit(user, [msg, chatbot], [msg, chatbot], queue=False)
    clear.click(lambda: None, None, chatbot, queue=False)

if __name__ == "__main__":
    demo.launch(debug=True)
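For what it's worth, I can reproduce the format mismatch without any API calls. The Gradio Chatbot component stores the conversation as a list of [user, bot] *lists* (which matches what the error message prints), while the chain appears to expect (user, bot) *tuples*. A minimal sketch of the conversion I suspect is needed (pure Python, no LangChain or Gradio required; `to_langchain_history` is just a name I made up):

```python
# History in the shape gr.Chatbot hands back to the callback:
# a list of [user_message, bot_message] lists.
gradio_history = [["Hi", "Hello! How can I assist you today?"]]

def to_langchain_history(history):
    """Convert Gradio-style [[user, bot], ...] pairs into the
    [(user, bot), ...] tuple format the chain seems to expect."""
    return [(user, bot) for user, bot in history]

langchain_history = to_langchain_history(gradio_history)
print(langchain_history)  # [('Hi', 'Hello! How can I assist you today?')]
```

Is something like this conversion what I'm missing inside my `user` callback, or is the problem elsewhere?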
Thanks,