Chat UI with Llama 2

Hi there! I’m interested in the HuggingChat user interface available on GitHub. I’m curious about the process of incorporating my own custom model, such as Llama 2, since I prefer not to opt for the pro version. Is it feasible to self-host the model?

Yes, that’s totally feasible!

This is exactly the setup Hugging Face uses to power HuggingChat: Text Generation Inference (TGI) as the backend and Chat UI as the frontend. Refer to the Chat UI README for how to plug in a custom model: GitHub - huggingface/chat-ui: Open source codebase powering the HuggingChat app.
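
In case it helps, here is a rough sketch of what self-hosting could look like. The TGI Docker image and `--model-id` flag, and the Chat UI `MONGODB_URL` / `MODELS` variables, come from their respective READMEs, but the exact ports, image tag, and `MODELS` keys below are assumptions that can differ between versions, so treat the READMEs as the source of truth.

First, serve the model with TGI (Llama 2 weights are gated, so you need an HF token with access):

```bash
# Start a TGI server for Llama 2 on port 8080 (requires a GPU).
docker run --gpus all --shm-size 1g -p 8080:80 \
  -v $PWD/data:/data \
  -e HUGGING_FACE_HUB_TOKEN=$HF_TOKEN \
  ghcr.io/huggingface/text-generation-inference:latest \
  --model-id meta-llama/Llama-2-7b-chat-hf
```

Then point Chat UI at that endpoint via a `.env.local` file in the chat-ui repo (keys here are illustrative; check the README for the current `MODELS` schema and for a `chatPromptTemplate` matching Llama 2's `[INST]` format):

```bash
# .env.local -- Chat UI needs a local MongoDB for chat history
MONGODB_URL=mongodb://localhost:27017
MODELS=`[
  {
    "name": "meta-llama/Llama-2-7b-chat-hf",
    "endpoints": [{ "url": "http://127.0.0.1:8080" }],
    "parameters": {
      "temperature": 0.2,
      "max_new_tokens": 1024
    }
  }
]`
```

After that, `npm install` and `npm run dev` from the chat-ui directory should give you a local HuggingChat-style interface talking to your own model, with no pro subscription involved.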