Add dependencies for Hosted Inference API

I am trying to configure the Hosted Inference API for my model hosted at Kieranm/britishmus_plate_material_classifier · Hugging Face

When I try to use the API, it reports “no module named timm”. I assume this means timm is not installed in the inference environment. How and where do I add this module as a dependency for the Hosted Inference API?

I have already tried adding “timm=0.5.4” to pyproject.toml and “timm” to a requirements.txt file.
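For reference, the usual pin syntax for those two files looks like the sketch below (pip-style files use `==` rather than `=`). Note that, as the reply below explains, the Inference API did not pick up custom dependencies from these files at the time, so this only matters for local installs or a Space:

```
# requirements.txt  (pip-style pin: "==", not "=")
timm==0.5.4

# pyproject.toml  (PEP 621 style)
[project]
dependencies = ["timm==0.5.4"]
```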

Hello @Kieranm! So at the moment we don’t allow adding custom dependencies to the Inference API, but we’ll likely be able to add timm to the FastAI inference environment soon. I can loop back to you when that’s done, and then your model should work just fine!

Good news! We’ve added timm==0.5.4 to the FastAI inference environment, so your model works now :smile:
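For anyone landing here later, here is a minimal sketch of querying the model through the Inference API now that timm is available in the environment. The access token and image filename are placeholders; the model id is the one from this thread:

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/Kieranm/britishmus_plate_material_classifier"
HEADERS = {"Authorization": "Bearer hf_xxxxxxxx"}  # your HF access token (placeholder)

# Image-classification endpoints accept raw image bytes in the request body.
with open("plate.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = requests.post(API_URL, headers=HEADERS, data=image_bytes)
response.raise_for_status()
# Returns a list of {"label": ..., "score": ...} dicts.
print(response.json())
```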


Amazing, thanks Nima! That was so fast.

I suppose the positive of this whole thing was that it forced me to learn how to make a Space in the meantime.
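For completeness, a rough sketch of the kind of Gradio Space that works for a fastai image classifier. The exported learner filename (`export.pkl`) and the top-3 label display are assumptions, not details from this thread:

```python
import gradio as gr
from fastai.vision.all import load_learner, PILImage

# "export.pkl" is a placeholder for whatever the learner was exported as.
learn = load_learner("export.pkl")
labels = learn.dls.vocab

def classify(img):
    # Gradio passes a PIL image; fastai's predict returns (class, index, probabilities).
    pred, idx, probs = learn.predict(PILImage.create(img))
    return {labels[i]: float(probs[i]) for i in range(len(labels))}

demo = gr.Interface(
    fn=classify,
    inputs=gr.Image(type="pil"),
    outputs=gr.Label(num_top_classes=3),
)
demo.launch()
```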
