I am completely new to Hugging Face and want to use it as a model host for CVAT annotations.
I have a Hugging Face API key, my PyTorch YOLOv8 model uploaded as a .pt file, and a basic config.json containing id2label and with model_type set to yolos.
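For reference, my config.json currently looks something like this (the label entries here are illustrative, not my real classes):

```json
{
  "model_type": "yolos",
  "id2label": {
    "0": "person",
    "1": "car"
  }
}
```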
When I attempt to run inference from the model card, the error is: "Pipeline cannot infer suitable model classes from ."
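As far as I can tell, the model card widget does roughly the equivalent of the following, which is where I would expect that error to come from (the repo id below is a placeholder for mine):

```python
from transformers import pipeline

# "my-username/my-yolov8" is a placeholder for my repo. pipeline() has to
# map the config's model_type to a concrete model class; if it cannot, it
# raises "Pipeline cannot infer suitable model classes from ...".
detector = pipeline("object-detection", model="my-username/my-yolov8")
```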
What do I need to do to make my model work?
Hello,
I’m encountering a similar issue (except I’m using YOLOv9).
As far as I understand, the yolos model type is distinct from the regular YOLO models. I’ve tried using yolo as a model_type, but it’s not recognized by CVAT (rest_framework.exceptions.APIException: Hugging Face error: The checkpoint you are trying to load has model type yolo but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.).
Does anyone know if custom YOLO models are supported by the CVAT integration?
If not, is there a way to bypass this, like declaring the model as another model type?
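For what it’s worth, here is a quick sketch of how I checked which model_type strings my local Transformers install recognizes (CVAT’s environment may of course differ):

```python
from transformers import CONFIG_MAPPING

# CONFIG_MAPPING maps every registered model_type string to its config class.
print("yolos" in CONFIG_MAPPING)  # True: yolos is a built-in architecture
print("yolo" in CONFIG_MAPPING)   # False: plain "yolo" is not, hence the APIException
```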
This error usually occurs when the installed version of Transformers is too old, when the checkpoint is corrupted, or when trust_remote_code=True has been forgotten, but in some cases it also seems to be caused by Python 3.9.
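If the repository ships its own modeling code, loading it explicitly would look roughly like this (a sketch; the repo id is a placeholder):

```python
from transformers import AutoConfig, AutoModel

# "user/custom-yolo" is a placeholder repo id. trust_remote_code=True lets
# Transformers execute the modeling code bundled with the repo instead of
# requiring a built-in architecture such as yolos.
config = AutoConfig.from_pretrained("user/custom-yolo", trust_remote_code=True)
model = AutoModel.from_pretrained("user/custom-yolo", trust_remote_code=True)
```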
I’m using Python 3.9.
I’m not explicitly using Transformers (should I be?), so it happens on the CVAT side. Is there a way to check or change the versions used there?
I see. So it’s an error message that appears when Transformers is being used indirectly…
If LlamaIndex is involved, the current version might be buggy.
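If you can get a shell in the environment CVAT uses for the Hugging Face integration (for a Docker deployment, inside the relevant container), a quick check would be:

```python
import transformers

# Run this inside the environment that CVAT uses to call Hugging Face;
# it prints the installed Transformers version.
print(transformers.__version__)
```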