Hello,
I want to deploy a custom inference endpoint that needs a binary file to be present before the Stable Diffusion (SD) pipeline is loaded. What is the recommended way to do this? I was thinking of using a custom container with the file baked in. Is that the right approach? Please advise.
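For context, here is a minimal sketch of the alternative I'm considering: downloading the file at startup inside a custom handler instead of baking it into the container. This assumes a `handler.py`-style entry point with an `EndpointHandler` class; `BINARY_URL`, `BINARY_PATH`, and the handler shape are placeholders, not anything from a real deployment.

```python
import os
import urllib.request

BINARY_URL = "https://example.com/assets/my_binary.bin"  # placeholder URL
BINARY_PATH = "/tmp/my_binary.bin"                       # placeholder path


def ensure_binary(url: str = BINARY_URL, path: str = BINARY_PATH) -> str:
    """Download the binary once; skip the download if it already exists."""
    if not os.path.exists(path):
        urllib.request.urlretrieve(url, path)
    return path


class EndpointHandler:
    """Hypothetical custom handler that fetches the binary before the pipeline."""

    def __init__(self, model_dir: str = ""):
        # Fetch the binary *before* constructing the pipeline so it is
        # available when the pipeline initializes.
        self.binary_path = ensure_binary()
        # Pipeline loading would go here, e.g. (commented out as a sketch):
        # from diffusers import StableDiffusionPipeline
        # self.pipe = StableDiffusionPipeline.from_pretrained(model_dir)

    def __call__(self, data: dict) -> dict:
        # Placeholder inference; a real handler would run the pipeline here.
        return {"binary": self.binary_path}
```

Would something like this be preferable to a custom image, or is the baked-in file the cleaner option?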
Thanks