Unknown error in model Inference API and Hub

I uploaded some LoRA models that were working fine for a while, without any changes. I could send a prompt to the API, or use the model card's API widget, and get output without any issues. (A sketch of the kind of call that used to work is below.)
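For reference, a minimal sketch of the call, assuming the serverless Inference API via `huggingface_hub` (the token placeholder is hypothetical; the model id is the one from this thread):

```python
# Minimal sketch: text-to-image via the serverless Inference API.
from huggingface_hub import InferenceClient

client = InferenceClient(token="hf_...")  # hypothetical placeholder token

# This is the same call the model card widget makes under the hood.
image = client.text_to_image(
    "a test prompt",
    model="ep150de/linglenet",  # model id from this thread
)
image.save("out.png")
```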

All of a sudden (starting a few days ago), my model and several others that I know used to work are now producing this unknown error in the UI.

Models having the issue:

Anyone have any ideas?

Hi @ep150de,

In order to have your models running on the community Inference API, they have to be compatible with Diffusers. Looking at the source of your model here (ep150de/linglenet at main), it looks like these are single-file safetensors models.
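To illustrate the distinction, here is a minimal sketch, assuming a Stable Diffusion base model (the base model id is an assumption, not from this thread). A Diffusers-format repo has a `model_index.json` plus component subfolders and loads with `from_pretrained()`; a single-file checkpoint or LoRA does not, so the API backend can't serve it on its own:

```python
# Minimal sketch: Diffusers-format repo vs. single-file LoRA weights.
import torch
from diffusers import AutoPipelineForText2Image

# A Diffusers-format repo loads directly (assumed base model id):
pipe = AutoPipelineForText2Image.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)

# A LoRA checkpoint must instead be loaded on top of a base pipeline --
# assuming the repo holds LoRA weights, which the serverless API
# cannot resolve by itself without the right metadata.
pipe.load_lora_weights("ep150de/linglenet")
```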

How did it work previously, then? I managed to get it working before without errors, and I've noticed the issue persists across any LoRA models that previously used to work from the model card web UI.

For example, if you look at the Spaces using my model, there is a test image-generation Space with several models, and all the ones getting errors are LoRA fine-tuned models.

https://huggingface.co/spaces/allknowingroger/Image-Models-Test232

It appears to be working again without errors. Not sure what changed, or if it was a backend issue.