Model card - parameters for inference

I tried using the `inference` parameter in my model card (b3ck1/gpt-neo-125M-finetuned-beer-recipes) and it worked for a while, but today I noticed the warning "Yaml error: inference must be a boolean" on the model page. I tried to fix it, but I cannot get the inference parameters to work correctly anymore.

Has something changed? Are inference parameters no longer allowed in model cards?
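For context, this is roughly what parameterized inference looks like in the model card's YAML front matter (the parameter names and values below are illustrative, not copied from my actual card):

```yaml
---
# Illustrative model card metadata: "inference" as an object with
# generation parameters, rather than a plain boolean.
inference:
  parameters:
    temperature: 0.7
    top_p: 0.9
    max_length: 500
---
```

The validator now seems to reject this object form and only accept `inference: true` or `inference: false`.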

Hi @b3ck1, thanks for raising this issue. We recently changed the way model card metadata is validated, and it's possible we introduced a regression here, cc @coyotte508. We'll look into it and get back to you shortly.

Hi @b3ck1, the issue is fixed; your old model card should now work again without errors.


@coyotte508 Yes, I can confirm that the model card now works without errors using the old inference parameters.
Thanks for the quick fix!