I am new to Inference Endpoints and recently hit an error when trying to initialize an endpoint with the MPT-7B model: the endpoint fails to start, saying that `trust_remote_code=True` must be set for my model.
I was wondering if there was any way to get around this?
Below is a snippet of the logs. I am currently testing a number of different models to see which best fits my use case, so I will probably have some other threads going as well :) Thanks!
2f4wm 2023-05-26T19:21:03.773Z ValueError: Loading /repository requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option `trust_remote_code=True` to remove this error.
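For context, the flag the error asks for is straightforward to pass when loading the model directly with `transformers` on my own machine (a minimal sketch, assuming `mosaicml/mpt-7b` is the hub repo in question); my problem is that I don't see where to set it for an Inference Endpoint:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# MPT-7B ships custom modeling code inside its repo, so transformers
# refuses to load it unless you explicitly opt in.
# (mosaicml/mpt-7b is assumed here as the hub id.)
model_id = "mosaicml/mpt-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # opt in to executing the repo's custom code
)
```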