Inference API down?

@nielsr I have debugged the problem further. The endpoints work as long as they are public, both with and without scale-to-zero. If I secure the endpoint and request it without a token, a 401 is returned. So far, so good. But if I pass a valid token, I get a 500. Do your integration tests cover secured endpoints?
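For reference, this is roughly how I pass the token, shown here as a minimal stdlib sketch that only builds the request (the endpoint URL and token are placeholders, not my real values):

```python
import urllib.request

API_URL = "https://xyz.endpoints.huggingface.cloud"  # placeholder endpoint URL
TOKEN = "hf_xxx"  # placeholder user access token

def build_request(payload: bytes, token: str) -> urllib.request.Request:
    # Bearer auth header, as expected by a secured endpoint.
    # Without it the endpoint correctly returns 401; with a valid
    # token I get a 500 instead of 200.
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request(b'{"inputs": "Hello"}', TOKEN)
print(req.get_header("Authorization"))
```

Sending `req` with `urllib.request.urlopen(req)` is what produces the 500 on my side.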