I want to create an inference endpoint for `Salesforce/codegen-16B-mono`, but the config file for that model (`config.json`) contains this:
"task_specific_params": {
"text-generation": {
"do_sample": true,
"max_length": 50,
"temperature": 1.0
}
}
I want to remove the `max_length` and `temperature` entries so they no longer constrain generation. What is the best way to do this?
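For context, the edit I have in mind is just deleting those two keys from the nested dict before the config is saved back out. A minimal sketch (using an inline copy of the relevant section rather than the real `config.json` file):

```python
import json

# The relevant section of the model's config.json, inlined for illustration
cfg = {
    "task_specific_params": {
        "text-generation": {
            "do_sample": True,
            "max_length": 50,
            "temperature": 1.0,
        }
    }
}

# Drop the two keys; pop(key, None) is a no-op if a key is already absent
gen = cfg["task_specific_params"]["text-generation"]
gen.pop("max_length", None)
gen.pop("temperature", None)

print(json.dumps(cfg, indent=2))
```

I'm unsure whether I should do this by editing `config.json` directly in a fork of the repo, or whether the endpoint offers a cleaner way to override these defaults.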