Cross-encoder inference API DOWN?

While accessing the cross-encoder/ms-marco-MiniLM-L-12-v2 model via the Inference API, I am getting the following error:

urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/cross-encoder/ms-marco-MiniLM-L-12-v2 (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f37ef085c70>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

Can someone provide more insight into this error, and how to solve it?

The same issue also occurs when calling the Inference API for the cross-encoder/ms-marco-MiniLM-L-6-v2 model.


From the error content, I think it's just overuse. Make sure to pass a read token at the time of the request, or simply wait an hour and the restriction will be relaxed.
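As a minimal sketch of what "pass a read token" means in practice: attach an `Authorization: Bearer <token>` header to the request. The example below only builds the authenticated request with the standard library (the model URL comes from the error above; `hf_xxx` is a placeholder for your own read token from your Hugging Face account settings, and the payload is illustrative):

```python
import json
import urllib.request

# Assumed endpoint for the model named in the error above.
API_URL = "https://api-inference.huggingface.co/models/cross-encoder/ms-marco-MiniLM-L-12-v2"
TOKEN = "hf_xxx"  # placeholder: substitute your own read token

# Build a POST request carrying the token in the Authorization header.
payload = json.dumps({"inputs": "example query [SEP] example passage"}).encode()
req = urllib.request.Request(
    API_URL,
    data=payload,
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# The request now carries the bearer token; send it with
# urllib.request.urlopen(req) once a real token is in place.
print(req.get_header("Authorization"))
```

Note that the traceback above ends in "Temporary failure in name resolution", which is a DNS/network failure on the client side, so it is also worth checking that the machine can reach huggingface.co at all (proxy, firewall, offline environment) before assuming rate limiting.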