I have a model that I'm using through an inference endpoint. When running it locally, I implemented custom stopping criteria.
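For context, my local setup looks roughly like this (a simplified sketch, not my exact code; the model name, stop sequence, and the `StopOnSequence` class are just placeholders):

```python
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    StoppingCriteria,
    StoppingCriteriaList,
)


class StopOnSequence(StoppingCriteria):
    """Stop generation once a given token sequence appears at the end of the output."""

    def __init__(self, stop_ids):
        self.stop_ids = stop_ids

    def __call__(self, input_ids, scores, **kwargs):
        # input_ids has shape (batch_size, sequence_length)
        if input_ids.shape[1] < len(self.stop_ids):
            return False
        return input_ids[0, -len(self.stop_ids):].tolist() == self.stop_ids


tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder model
model = AutoModelForCausalLM.from_pretrained("gpt2")

stop_ids = tokenizer.encode("\n\n", add_special_tokens=False)  # placeholder stop sequence
criteria = StoppingCriteriaList([StopOnSequence(stop_ids)])

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, stopping_criteria=criteria)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

This works fine when I call `model.generate()` myself, but I don't see where to plug it in once the model is served behind the endpoint.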
Is there a way to embed this stopping criteria in the model deployed through the inference endpoint?
Thanks