How to avoid `trust_remote_code=True` for my models

Hey, one of my ORPO-tuned models needs trust_remote_code=True, which means I can't submit it to the leaderboards. What's responsible for this, and how can I remove the dependency? My model repo has two safetensors files and no Python script explicitly in it.


Hi,

The trust_remote_code=True flag is required for models whose modeling code lives on the Hub rather than natively in the Transformers library.
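Concretely, what makes a repo depend on trust_remote_code is usually not a Python file you added yourself but an "auto_map" entry in the repo's config.json, which points the Auto* classes at custom code on the Hub. A minimal, network-free sketch of that check (the example configs and the helper name are illustrative, not the library's internals):

```python
# Sketch: a repo needs trust_remote_code=True when its config.json maps
# Auto* classes to custom code files via "auto_map" (illustrative configs).
config_custom = {
    "model_type": "phi3",
    "auto_map": {"AutoModelForCausalLM": "modeling_phi3.Phi3ForCausalLM"},
}
config_native = {"model_type": "llama"}  # resolved by code shipped in Transformers

def needs_trust_remote_code(config: dict) -> bool:
    """True if loading this config would pull custom code from the Hub."""
    return "auto_map" in config

print(needs_trust_remote_code(config_custom))  # True
print(needs_trust_remote_code(config_native))  # False
```

So inspecting (or removing, once native support exists) the "auto_map" entry in your model's config.json is where this dependency comes from.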

Models with that flag are created based on this guide: Building custom models


Is there any way I can fix it?
I used the same training script to train llama-orpo, which is fine, but training phi3-orpo introduced this dependency.

For that, Phi-3 would need to be natively integrated into the Transformers library.


Update: Phi-3 is now supported natively in Transformers as of version 4.40.
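A quick way to act on this is to gate your loading code on the installed Transformers version. This is a hedged sketch: the helper below is illustrative and only does a simple major/minor comparison against the 4.40 threshold mentioned above:

```python
def supports_phi3_natively(transformers_version: str) -> bool:
    """Phi-3 landed natively in Transformers 4.40, so versions >= 4.40.0
    should not need trust_remote_code for it (per the thread above).
    Assumes a plain "major.minor.patch" version string."""
    major, minor, *_ = (int(part) for part in transformers_version.split("."))
    return (major, minor) >= (4, 40)

print(supports_phi3_natively("4.39.3"))  # False -> still needs trust_remote_code
print(supports_phi3_natively("4.40.0"))  # True  -> native loading works
```

In practice you would pass `transformers.__version__` to such a check, or simply pin `transformers>=4.40` in your requirements.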


Hi, I am currently using Phi-3 Vision Instruct and am getting the same issue. When I try to deploy it using Flask/FastAPI, it asks me to answer yes at a prompt. When the server auto-restarts and the prompt is not answered in the terminal, it shuts down the entire operation.

Is there any way we can bypass it, or do something else, before this model is moved natively into Transformers?

Hi @Archan,

Could you clarify why it is prompting you? I assume that if you deploy this model on your backend, you would have already loaded the model with trust_remote_code=True.
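The interactive yes/no prompt only appears when no explicit trust_remote_code value was passed for a custom-code repo; passing True (or False) skips it. A simplified, network-free sketch of that decision (the function and its parameters are illustrative, not the library's actual internals):

```python
def resolve_trust_remote_code(flag, interactive: bool) -> bool:
    """Sketch of the choice made when loading a custom-code repo (simplified).
    An explicit True/False skips the prompt entirely; leaving it unset in a
    non-interactive process (e.g. a restarting Flask/FastAPI worker) fails."""
    if flag is not None:
        return flag  # explicit value: no prompt, safe for servers
    if not interactive:
        raise RuntimeError("custom code requires an explicit trust_remote_code")
    return input("Trust remote code? [y/N] ").strip().lower() == "y"

# In a deployment, load once at startup with an explicit flag:
print(resolve_trust_remote_code(True, interactive=False))  # True, no prompt
```

So for your Flask/FastAPI server, make sure every `from_pretrained(...)` call passes `trust_remote_code=True` explicitly, so a restart never blocks on a terminal prompt.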

Also, for deploying Phi-3 Vision, you might be interested in vLLM: Supported Models — vLLM.