I'm facing a problem using the MedGemma model through the Inference API.

I'm getting this message even from the model page:
Failed to perform inference: an error occurred while streaming the response: Model doesn't exist: google/medgemma-4b-it

What should I do to solve this problem?
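In case it's useful, a minimal script along these lines reproduces the error on my side. This is only a sketch: I'm assuming the huggingface_hub InferenceClient route, and HF_TOKEN is a placeholder for your own access token.

```python
# Minimal reproduction sketch. Assumes huggingface_hub is installed and
# HF_TOKEN holds a valid access token; the exact prompt doesn't matter.
import os
from huggingface_hub import InferenceClient

client = InferenceClient(model="google/medgemma-4b-it", token=os.environ["HF_TOKEN"])

# medgemma-4b-it is a conversational model, so chat_completion is the
# natural call; this is where the "Model doesn't exist" error comes back.
response = client.chat_completion(
    messages=[{"role": "user", "content": "List common symptoms of anemia."}],
    max_tokens=128,
)
print(response.choices[0].message.content)
```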


Same here for now.


I've reported it on Discord for now.


I'm having the same problem. I thought it was just me, but I've realized it seems to be a general issue. Is there any news about this error?


Is there any news about this error?

Not yet.


Sorry, I'm new here. Is there another way to report this error?


Is there another way to report this error?

Email is the official channel, but there are multiple addresses and I can't confirm which one is meant for bug reports: website@huggingface.co, feedback@huggingface.co, … Additionally, the Issues section of the hub-docs GitHub repository can be used for general Hub error reports.


Thanks for reporting! We’re taking a look and I’ll update soon.


Thanks as always, Megan!

Hi everyone, thanks for waiting! This should now be working as expected, but let us know if that's not the case. Have a great day :hugs:
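For anyone who wants to double-check from their side, here's a quick sketch using huggingface_hub. It assumes the library is installed; note the MedGemma repo is gated, so you may need a token for an account that has accepted the model terms.

```python
from huggingface_hub import model_info

# Sanity check that the Hub resolves the repo again. Raises
# RepositoryNotFoundError if the model id genuinely doesn't exist;
# pass token="hf_..." if your environment isn't already authenticated,
# since the MedGemma repo is gated.
info = model_info("google/medgemma-4b-it")
print(info.id, info.pipeline_tag)
```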


Thank you!