Persist Chat History for Gradio API

I am interacting with the Gradio API from a chatbot UI, and I want to be able to continue the conversation on another device, the way ChatGPT's history lets you pick up a chat where you left off. How can I achieve this?


import { useState, useEffect } from "react";
import { Client } from "@gradio/client";

  // Inside the chat component:
  const [chatHistory, setChatHistory] = useState([]);
  const [isLoading, setIsLoading] = useState(false);
  const [input, setInput] = useState("");
  const [previewMessage, setPreviewMessage] = useState("");
  const [clientt, setClientt] = useState(null);

  useEffect(() => {
    (async () => {
      const cli = await Client.connect("Qwen/Qwen2.5-VL-72B-Instruct");
      setClientt(cli);
    })();
  }, []);

  useEffect(() => {
    console.log('History:', JSON.stringify(chatHistory));
  }, [chatHistory]);

  const sendMessage = async () => {
    if (!input.trim() || !clientt) return; // ignore empty input or sends before the client is ready

    const newMessage = input;
    setInput('');
    setIsLoading(true);

    try {
      // Append the user's message to the Space-side history
      await clientt.predict("/add_text", {
        history: chatHistory,
        text: newMessage,
      });

      // Clear the Space's input box
      await clientt.predict("/reset_user_input");

      // Ask the model for a reply (null = bot reply still pending)
      const data = await clientt.predict("/predict", {
        _chatbot: [[newMessage, null]],
      });

      // console.log(JSON.stringify(data.data[0][0]));

      let botResponse = data.data[0][0] || "I couldn't understand that.";
      setChatHistory(prev => [...prev, botResponse]);
      // console.log('resp: ', JSON.stringify(data.data));
    } catch (error) {
      console.error("Error communicating with the model:", error);
      setChatHistory((prev) => [
        ...prev,
        [newMessage, "Sorry, there was an error processing your request."],
      ]);
    } finally {
      setIsLoading(false);
    }
  };

*clientt is not a typo

It’s a little old, but it’s similar to this.

No, that's not the solution. It just stores the chat, but I want the model to continue the conversation on another device, picking up the previous topic. How can I achieve that?

If you want to continue on other devices, there are several methods, but if you want to target multiple users, you will need to create a simple login system and database.

If you only want to target a single user, there is a way to store it in a global variable… but this is not clean, so we do not recommend it.
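
Either way, the key is to persist the chat history itself (not the client object) somewhere every device can reach, and reload it before you keep chatting. A rough sketch, where the /api/history endpoint and userId are hypothetical placeholders for whatever storage you set up:

// Hypothetical sketch: save/restore the history array via your own backend.
// "/api/history" and userId are placeholders, not part of the Qwen Space.
async function saveHistory(userId, chatHistory) {
  await fetch(`/api/history/${userId}`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(chatHistory),
  });
}

async function loadHistory(userId) {
  const res = await fetch(`/api/history/${userId}`);
  return res.ok ? await res.json() : [];
}

On page load you would call loadHistory() and feed the result into setChatHistory(), then call saveHistory() after every exchange.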

I will think about multiple users later; can you help me with the single-user case?
How can the Qwen 2.5 VL 72B model continue a chat on the same topic on another device?
I tried:

      // Attempt: serialize the connected client and restore it later
      const cli = await Client.connect("Qwen/Qwen2.5-VL-72B-Instruct");
      localStorage.setItem("key", JSON.stringify(cli));
      const storedData = JSON.parse(localStorage.getItem("key"));
      console.log('sdata', storedData);
      setClientt(storedData);

but it doesn't work.

If you want to do it using the existing space, there are almost no options…

However, there is a way: record the full history of your conversation with the chatbot yourself, and then pass both the new message and that history together on every call.

There are several possible data structures for history, but basically you should prepare something like the one below. (This is the Python case; there should be something similar in JS as well, as sketched right after it.)

history = [
    {"role": "user", "content": "Hello."},
    {"role": "assistant", "content": "Hello."},
    {"role": "user", "content": "How are you?"},
    {"role": "assistant", "content": "Fine."},
]
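
In plain JavaScript that is just an array of objects with the same shape (a sketch of the same structure):

// Same structure as the Python example, as a plain JS array
const history = [
  { role: "user", content: "Hello." },
  { role: "assistant", content: "Hello." },
  { role: "user", content: "How are you?" },
  { role: "assistant", content: "Fine." },
];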

Yes, but I can't find how to do that with the JS client.

How can I use Transformers.js for VL models?

Maybe this.

In the case of JavaScript, it seems that history is a two-dimensional list.

import { Client } from "@gradio/client";

const client = await Client.connect("Qwen/Qwen2.5-72B-Instruct");
const result = await client.predict("/model_chat", {
  query: "Hello!!",
  history: [["Hello!", null]],
  system: "Hello!!",
});

console.log(result.data);
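
To actually continue a conversation, the idea would be to append each [query, reply] pair to that two-dimensional list and pass the accumulated list back in on the next call, loading it from wherever you persisted it when you switch devices. A rough sketch against the same /model_chat endpoint (the exact shape of result.data is an assumption; check it in the console first):

import { Client } from "@gradio/client";

// A list of [user, assistant] pairs, restored from wherever you persist it
// (localStorage only helps on one browser; use a shared store to go cross-device).
let history = JSON.parse(localStorage.getItem("qwen_history") ?? "[]");

const client = await Client.connect("Qwen/Qwen2.5-72B-Instruct");

async function chat(query) {
  const result = await client.predict("/model_chat", {
    query,
    history,
    system: "You are a helpful assistant.",
  });

  // Assumption: one of the outputs in result.data is the updated [user, assistant]
  // history; log result.data once to confirm which index it is for this Space.
  history = result.data[1];

  localStorage.setItem("qwen_history", JSON.stringify(history));
  return history[history.length - 1][1]; // latest assistant reply
}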

Doesn't work 😔

It has these options:
const cli = await Client.connect("Qwen/Qwen2.5-VL-72B-Instruct", {
  // Session continuity
  eventstream: …, // Enable event stream
  events: …,
  auth: …,
  hf_token: …,
  headers: …,
  status_callback: …,
  with_null_state: …,
});

anything useful?

I think it’s for a different purpose. I wonder if what you’re aiming for is similar to this.

client.ts of gradio:

app_reference: string;
options: ClientOptions;

config: Config | undefined;
api_prefix = "";
api_info: ApiInfo<JsApiData> | undefined;
api_map: Record<string, number> = {};
session_hash: string = Math.random().toString(36).substring(2);
jwt: string | false = false;
last_status: Record<string, Status["stage"]> = {};

private cookies: string | null = null;

// streaming
stream_status = { open: false };
pending_stream_messages: Record<string, any[][]> = {};
pending_diff_streams: Record<string, any[][]> = {};
event_callbacks: Record<string, (data?: unknown) => Promise<void>> = {};
unclosed_events: Set<string> = new Set();
heartbeat_event: EventSource | null = null;
abort_controller: AbortController | null = null;
stream_instance: EventSource | null = null;
current_payload: any;
ws_map: Record<string, WebSocket | "failed"> = {};

example:

  console.log(cli.config);
  cli.close();
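
Since session_hash is just a public field generated at connect time, one untested idea (not an official API) is to persist that hash and reassign it on the other device so requests hit the same Gradio session. Server-side session state can be cleaned up at any time, though, so replaying the stored history as above is the more reliable route. A sketch, where loadSavedHash()/saveHash() are hypothetical helpers backed by storage both devices can reach:

import { Client } from "@gradio/client";

// Untested sketch: try to reuse a previous session_hash so requests hit the
// same Gradio session. loadSavedHash()/saveHash() are hypothetical helpers,
// not part of @gradio/client.
const cli = await Client.connect("Qwen/Qwen2.5-VL-72B-Instruct");

const savedHash = await loadSavedHash();
if (savedHash) {
  cli.session_hash = savedHash; // subsequent predict() calls send this hash
} else {
  await saveHash(cli.session_hash);
}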

That just stores data in local storage; I want to continue the chat itself, like ChatGPT does.