Failed to Initialize MPT-7B endpoint due to 'trust_remote_code' Error

Hello!

I am new to Inference Endpoints and recently hit an error while trying to initialize an endpoint with the MPT-7B model. The logs say that `trust_remote_code=True` must be set for this model.

I was wondering if there was any way to get around this?

Below is a snippet of the logs. I am currently testing a number of different models to see which best fits my use case, so I will probably have some other threads going as well :) Thanks!

2f4wm 2023-05-26T19:21:03.773Z ValueError:
 Loading /repository requires you to execute the configuration file in that repo on your local machine.
 Make sure you have read the code there to avoid malicious use, then set the option `trust_remote_code=True` to remove this error.

Hello @Nathan-Kowalski,

we don't yet have an automatic way to provide a value for `trust_remote_code`. For now, you would need to add a custom handler.
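To illustrate what that handler could look like: below is a minimal sketch of a `handler.py`, assuming the Inference Endpoints custom-handler convention of an `EndpointHandler` class that receives the repository path. The only essential part is passing `trust_remote_code=True` to `from_pretrained`; everything else (pipeline task, parameter names) is an assumption you would adapt to your model.

```python
# Hypothetical handler.py for a custom Inference Endpoints handler.
# The key detail is trust_remote_code=True in both from_pretrained calls.

from typing import Any, Dict, List


class EndpointHandler:
    def __init__(self, path: str = ""):
        # Imported lazily here so the module can be inspected even
        # without transformers installed.
        from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

        tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True)
        model = AutoModelForCausalLM.from_pretrained(path, trust_remote_code=True)
        self.pipeline = pipeline("text-generation", model=model, tokenizer=tokenizer)

    def __call__(self, data: Dict[str, Any]) -> List[Dict[str, Any]]:
        # Inference Endpoints pass a JSON payload like
        # {"inputs": "...", "parameters": {...}}.
        inputs = data.pop("inputs", data)
        parameters = data.pop("parameters", {})
        return self.pipeline(inputs, **parameters)
```

You would commit this file as `handler.py` in the model repository the endpoint is created from.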


Hi @philschmid! Creating a custom handler just to pass a single boolean seems like extreme overkill. Is there any way to pass this flag as an environment variable? Is this on the roadmap anywhere? What is holding this feature back? Best


It seems there is an environment variable for this.

From now on, TGI will not convert automatically pickle files without having --trust-remote-code flag or TRUST_REMOTE_CODE=true in the environment variables. This flag is already used for community defined inference code, and is therefore quite representative of the level of confidence you are giving the model providers.

Quoted from: How to avoid trust_remote_code=True for my models