Hugging Face for Production


I have a technical assignment that requires me to deploy an LLM and host it in an isolated environment, so that I have full control over the data traffic inside the deployment and no malicious code can run inside it and send data outside.
The model will process user data, and my organization must retain full control over the stored data and be able to purge it on demand.
Is Hugging Face Spaces a suitable place to host a model such as Mistral 7B Instruct and wrap it with a small API that returns a text generation?
Is Hugging Face only for experiments, or is it also designed for production hosting that large companies depend on?
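To make the question concrete, here is a minimal sketch of the kind of API wrapper I have in mind, using only the Python standard library. The `generate` function is a stub standing in for the real local model call (e.g. Mistral 7B Instruct loaded in-process), so that no user text would ever have to leave the isolated environment:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def generate(prompt: str) -> str:
    # Placeholder: in the real deployment this would call the locally
    # hosted model; the API never forwards data to an external service.
    return f"[generated continuation of: {prompt!r}]"


class GenerateHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Expect a JSON body like {"prompt": "..."} and return {"text": "..."}.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps({"text": generate(payload.get("prompt", ""))}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


# To serve inside the deployment (hypothetical host/port):
# HTTPServer(("127.0.0.1", 8000), GenerateHandler).serve_forever()
```

The question is whether a Spaces deployment could give this kind of wrapper the network and storage isolation described above.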
Please advise.
