For some reason I get `KeyError('inputs')` on any request to the QA hosted inference API, like:
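For reference, a request shaped like the following sketch is what the hosted question-answering task expects; the model name is just an example and the token is a placeholder. The QA task takes the question/context pair nested under a top-level `"inputs"` key, so a payload missing that key is one plausible way to hit this error:

```python
import json

# Example public QA model on the hosted Inference API; substitute your own.
API_URL = "https://api-inference.huggingface.co/models/deepset/roberta-base-squad2"

def build_qa_payload(question: str, context: str) -> dict:
    # The question-answering task expects the inputs nested under an
    # "inputs" key in the JSON body.
    return {
        "inputs": {
            "question": question,
            "context": context,
        }
    }

payload = build_qa_payload(
    "Where do I live?",
    "My name is Sarah and I live in London.",
)
body = json.dumps(payload).encode("utf-8")

# To actually send the request (requires a valid API token):
# import urllib.request
# req = urllib.request.Request(
#     API_URL,
#     data=body,
#     headers={"Authorization": "Bearer <your-token>",
#              "Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read())
```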
Thanks for checking
This is for @mfuntowicz
Thanks for reporting, the issue should be fixed now.
Sorry for the inconvenience
Hi there, I am also getting this error message when I go to fine-tune the pretrained `BertForMaskedLM` on my dataset object.
A detailed view of my code and error message is available at the link below.
Any idea what this error represents? Is it related to my dataset, the Trainer, or perhaps the model? Thank you!