I’m developing a UI for easy inferencing of models. Right now it’s mainly a chatbot interface with markdown rendering and websocket streaming, but I’ve got components in the works to support Stable Diffusion with control over parameters, prompts, and so on. It currently runs Vicuna and LLaMA using PyTorch or pyllama.cpp depending on your chosen mode, and I’m working to add support for other models. (Since the PyTorch path is fairly generic, it likely works with many models, but I haven’t tested.)
The primary goal is to build an appealing, responsive UI above all else, so that using cutting-edge AI technology doesn’t feel like a trip down memory lane to my WinNT sysadmin days.
I’m primarily a front-end guy, so if anyone has ideas or would like to contribute directly to expanding this tool’s capabilities, please step forward!
Demo here, but it’s hosted on my home server, so… solid chance of more downtime than uptime. YMMV: https://model.tanglebox.ai/