Chat-ui with a model from the Hub

I am trying to use my own fine-tuned model as a backend for chat-ui. I have tried plugging the model into the .env.local file, but I get a 500 from the server.
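For context, my MODELS entry in .env.local looks roughly like this (the model name, endpoint URL, and parameters below are placeholders, not my exact values, and I'm assuming a TGI-style endpoint; the endpoints format may differ depending on the chat-ui version):

```
# .env.local -- placeholder values, not my real model name or endpoint
MODELS=`[
  {
    "name": "my-username/my-finetuned-model",
    "parameters": {
      "temperature": 0.7,
      "max_new_tokens": 1024
    },
    "endpoints": [
      {
        "type": "tgi",
        "url": "http://127.0.0.1:8080"
      }
    ]
  }
]`
```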

The only error in the server log is:

```
285f3b81efde02dc8b22928f0e038f7d51e7bf4822df3e2769643589b931"},"url":"http://52.23.214.221:4173/","params":{},"request":{},"error":{"lineNumber":9,"columnNumber":0},"errorId":"9b59bbe2-16ab-49cc-9604-146233915f57"}
```

This is not super helpful. Is this just something that can't be done? The docs seem to indicate it is possible by adding the model specs to .env.local, but that is not working for me.

Any thoughts or guidance are welcome. If I can provide any additional info, please let me know.

For what it's worth, when I use the default Mistral model, everything works fine.