Getting the error: "ValueError: The following model_kwargs are not used by the model:....."

I need some help with an error I am facing. I am re-implementing the code from a research paper, available on its GitHub page here: GitHub - XiangLi1999/ContrastiveDecoding: contrastive decoding

When I run the command exactly as given in the repo (using run_generation.py), with the proper environment and all necessary libraries installed, I get the following error:

ValueError: The following model_kwargs are not used by the model: ['min_prob', 'student_lm', 'teacher_student', 'model_kwargs_student', 'st_coef', 'tokenizer', 'student_min_prob', 'student_temperature', 'use_cap_student', 'use_switch'] (note: typos in the generate arguments will also show up in this list)

I found online that downgrading my transformers version from the current latest release to 4.21.0 would make it work, and it did, but then I am unable to use the code with newer models such as Mistral or Llama 2. I would like to know how to work around this and resolve it. My main question is: how can run_generation.py (from the repo) be altered so that it works with the latest version of transformers?

I need help with this urgently and would appreciate any input on the matter. Thanks.

cc @joaogante

Hi @Pranav0511 :wave:

My suggestion would be to comment out the line that performs that check, i.e. this one.

The check exists to confirm that all arguments passed to generate are actually consumed, and it is built with the original codebase in mind :slight_smile:
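If you would rather not edit the installed package, a minimal sketch of an alternative is to monkey-patch the validation into a no-op at the top of run_generation.py. This assumes the check lives in GenerationMixin._validate_model_kwargs, which is where recent 4.x releases raise this ValueError; the exact location can move between versions, so treat this as a sketch rather than a guaranteed fix.

```python
# Sketch: disable generate()'s model_kwargs validation so the extra
# contrastive-decoding arguments (student_lm, st_coef, ...) pass through.
# Assumes a recent transformers 4.x where GenerationMixin._validate_model_kwargs
# performs the check; adjust the import/attribute name if your version differs.
from transformers.generation.utils import GenerationMixin

def _skip_model_kwargs_validation(self, model_kwargs):
    # No-op: accept unused model_kwargs instead of raising ValueError.
    pass

GenerationMixin._validate_model_kwargs = _skip_model_kwargs_validation
```

Placing this before the call to model.generate(...) has the same effect as commenting out the check, but keeps the change inside the repo's script instead of your site-packages, so upgrading transformers later does not undo it.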