Conversational Like Gemini using Meta-Llama-3-8B-Instruct

Hi, I'm new to AI. I want to replicate the conversational request format of Gemini. Below is an example request:

https://generativelanguage.googleapis.com/v1beta/models/gemini-pro:generateContent

{
  "contents": [
    {
      "role": "user",
      "parts": [
        {
          "text": "Be conversational and focused on helping someone generate ideas related to creating a profitable side-hustle. If there is a question that's not related, explain your boundaries and limits and don't answer it. Keep all answers as short and precise as possible."
        }
      ]
    },
    {
      "role": "model",
      "parts": [
        {
          "text": "Sure, here are some ideas:\n\n* Freelance writing: If you have good writing skills, you can offer your services on platforms like Upwork, Fiverr, or Freelancer.\n* Online tutoring: If you have expertise in a particular subject, you can offer online tutoring services.\n* Virtual assistant work: Virtual assistants provide administrative and other support services to clients remotely.\n* Social media management: If you're savvy with social media, you can offer social media management services to businesses.\n* Online course creation: If you have expertise in a particular area, you can create and sell an online course.\n* Blogging: If you have a passion for writing, you can start a blog and monetize it through advertising, affiliate marketing, or selling your own products or services.\n* Podcasting: If you enjoy talking about a particular topic, you can start a podcast and monetize it through advertising, sponsorships, or selling your own products or services.\n* YouTube videos: If you enjoy making videos, you can start a YouTube channel and monetize it through advertising, sponsorships, or selling your own products or services.\n\nHere are some questions that are not related to creating a profitable side-hustle:\n\n* What is the best way to invest my money?\n* How can I get a promotion at work?\n* What are the best ways to save money?\n\nI cannot answer these questions because they are beyond my scope of knowledge."
        }
      ]
    },
    * value *
  ]
}
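Most other chat APIs and templates expect a flat list of `{"role", "content"}` messages rather than Gemini's nested `contents`/`parts` structure, and they use `assistant` where Gemini uses `model`. Here is a minimal Python sketch of that conversion (the function name is mine, not part of either API):

```python
def gemini_contents_to_messages(contents):
    """Flatten Gemini-style `contents` into role/content messages.

    Gemini labels assistant turns "model" and nests text under "parts";
    most chat templates expect [{"role": ..., "content": ...}] instead.
    """
    messages = []
    for turn in contents:
        role = "assistant" if turn["role"] == "model" else turn["role"]
        # Join all text parts of the turn into one content string.
        text = "\n".join(part["text"] for part in turn.get("parts", []))
        messages.append({"role": role, "content": text})
    return messages


contents = [
    {"role": "user", "parts": [{"text": "Suggest a side-hustle."}]},
    {"role": "model", "parts": [{"text": "Try freelance writing."}]},
]
print(gemini_contents_to_messages(contents))
```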

How can I format requests to the Hugging Face serverless Inference API in the same way?

https://api-inference.huggingface.co/models/meta-llama/Meta-Llama-3-8B-Instruct

Instead of this format:
{"inputs": ""}

How can I make it conversational via the serverless API?
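One way to get multi-turn behavior out of the serverless endpoint is to render the conversation into Llama 3 Instruct's chat template yourself and send the result as `inputs`. A minimal sketch, assuming a Hugging Face token in an `HF_TOKEN` environment variable; the helper names are mine:

```python
import json
import os
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/meta-llama/Meta-Llama-3-8B-Instruct"


def build_llama3_prompt(messages):
    """Render role/content messages with Llama 3 Instruct's special tokens."""
    prompt = "<|begin_of_text|>"
    for m in messages:
        prompt += (
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    # Leave the assistant header open so the model generates the next reply.
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt


def chat(messages):
    """POST the rendered prompt to the serverless text-generation endpoint."""
    payload = {
        "inputs": build_llama3_prompt(messages),
        "parameters": {"max_new_tokens": 256, "return_full_text": False},
    }
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['HF_TOKEN']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)[0]["generated_text"]


messages = [
    {"role": "system", "content": "Only discuss profitable side-hustles; keep answers short."},
    {"role": "user", "content": "Give me three ideas."},
]
# chat(messages) would return the assistant's reply as a string;
# append it as {"role": "assistant", ...} to keep the conversation going.
print(build_llama3_prompt(messages))
```

Each new user turn gets appended to `messages` and the whole history re-sent, since the endpoint itself is stateless.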