Sending System and User Prompts to a Deployed Endpoint

I just deployed dolphin2.2-mistral using the Hugging Face Deploy -> Inference Endpoint flow and got it working.

It takes inputs as a payload and works fine. My issue is: what values should I send to get a GPT-like experience, with something like system, user, and assistant roles?

How can I change the system prompt and send a chain of conversational messages, the way I can with the OpenAI GPT API?

A payload like:

{
  "inputs": "What are different ways to cope with a long, stressful day?",
  "parameters": {
    "max_new_tokens": 100
  }
}

works.

But how can I send conversation-style messages?
Also, if I deploy a model, how can I modify the code myself?

Sorry if this is two questions merged into one.
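
For reference, this is roughly the kind of structure I mean, similar to what the OpenAI chat API accepts (the content here is just an illustration):

```python
# The conversation shape I would like to send: explicit roles instead of
# a single flat "inputs" string.
conversation = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What are different ways to cope with a long, stressful day?"},
        {"role": "assistant", "content": "You could take a walk, talk to a friend, or write things down."},
        {"role": "user", "content": "Which of those works best right before bed?"},
    ]
}
```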

Look up how to create a custom endpoint handler; it may help your use case. You can send whatever you need, and in the endpoint handler you process the input and pass it to the pipeline.
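
For instance, here is a minimal sketch of a handler.py, assuming your checkpoint follows the ChatML prompt format described on the dolphin model cards; the role names, template, and parameter handling are assumptions you should adapt to your model:

```python
# handler.py -- minimal custom handler sketch for a Hugging Face Inference Endpoint.
from typing import Any, Dict, List

from transformers import pipeline


class EndpointHandler:
    def __init__(self, path: str = ""):
        # `path` points to the model files inside the endpoint container.
        self.pipeline = pipeline("text-generation", model=path, device_map="auto")

    def _build_prompt(self, messages: List[Dict[str, str]]) -> str:
        # Flatten OpenAI-style messages into a single ChatML prompt string
        # (assumed template -- check your model card).
        prompt = ""
        for message in messages:
            prompt += f"<|im_start|>{message['role']}\n{message['content']}<|im_end|>\n"
        prompt += "<|im_start|>assistant\n"
        return prompt

    def __call__(self, data: Dict[str, Any]) -> List[Dict[str, Any]]:
        # Accept either a plain string or a list of chat messages in "inputs".
        inputs = data.get("inputs", "")
        parameters = data.get("parameters", {}) or {}
        prompt = self._build_prompt(inputs) if isinstance(inputs, list) else inputs
        # Generate only the new tokens, not the echoed prompt.
        return self.pipeline(prompt, return_full_text=False, **parameters)
```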

Do you have an example payload? I am pretty new at this, and I am sure I will spend a lot of time on it without being certain I can figure it out.

The page I linked to above has quite a few examples in the linked repositories. Are you looking for something different?
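
If it helps, with a handler along the lines of the sketch above, a request could look like this (the endpoint URL, token, and message contents are placeholders):

```python
import requests

ENDPOINT_URL = "https://<your-endpoint>.endpoints.huggingface.cloud"  # placeholder
HF_TOKEN = "hf_..."  # placeholder token with access to the endpoint

payload = {
    "inputs": [
        {"role": "system", "content": "You are Dolphin, a helpful assistant."},
        {"role": "user", "content": "What are different ways to cope with a long, stressful day?"},
    ],
    "parameters": {"max_new_tokens": 100},
}

response = requests.post(
    ENDPOINT_URL,
    headers={"Authorization": f"Bearer {HF_TOKEN}", "Content-Type": "application/json"},
    json=payload,
)
print(response.json())
```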