Demo of Open Domain Long Form Question Answering

Hi,
I just wanted to try the demo https://huggingface.co/qa/, but it seems to be down. Could we have it back, please? Thanks.

The demo is back on, thanks for letting us know!

3 Likes

Hi @yjernite, thanks for your great work and the great blog post!

At the moment, the system is down again, so would it be possible to have a Colab or Kaggle notebook, so that we can use our own GPU?

EDIT It’s back! So sometimes the server just goes down, or there is some problem with the GPU… But a standalone notebook would be great anyway :slight_smile:

Hi @Jung, there is a notebook for it: https://github.com/huggingface/notebooks/blob/master/longform-qa/Long_Form_Question_Answering_with_ELI5_and_Wikipedia.ipynb. I followed the instructions in the notebook, and after several hours (downloading, indexing, etc., using the pre-trained models) the system was ready to take questions.
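
If you just want to try the generator without building the full Wikipedia index first, something along these lines should work as a minimal sketch — the model name is the one used in the notebook, but the support passage here is just a placeholder standing in for retrieved documents, and the exact input format is my assumption:

```python
# Minimal sketch (not the full notebook): generate a long-form answer with
# the pre-trained ELI5 BART model, given a question and a support document
# that you supply yourself instead of retrieving from Wikipedia.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "yjernite/bart_eli5"  # seq2seq model referenced in the notebook
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

question = "Why does the sky change color at sunset?"
support = "At sunset, sunlight travels through more of the atmosphere ..."  # placeholder context

# The question and support passages are concatenated into a single input
# (assumed "question: ... context: ..." format).
input_text = f"question: {question} context: {support}"
inputs = tokenizer(input_text, return_tensors="pt", truncation=True, max_length=1024)

output_ids = model.generate(
    **inputs,
    num_beams=8,
    min_length=64,
    max_length=256,
    no_repeat_ngram_size=3,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Answers will of course only be as good as the support text you pass in; the several-hours part of the notebook is downloading and indexing the Wikipedia snippets that provide that support automatically.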

2 Likes

Thanks @cahya for the pointer!!
The HF team indeed has many great notebooks that I had missed :slight_smile:
I will definitely try it out!

Hi Yacine, @yjernite

I believe the demo has been down (I have waited for 24 hours):

https://huggingface.co/qa/
returns 502 Bad Gateway

Hi, I have dockerized the engine and provided a REST API to use it, if you want to try it: https://hub.docker.com/r/wirawan/eli5
I will add a front end to the Docker image later.
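
For anyone who pulls the image, calling the service from Python would look roughly like this — the port and the /ask route are placeholders I made up, so please check the image's documentation for the actual port mapping and endpoint path:

```python
# Rough sketch of calling the dockerized QA service over its REST API.
# The host port (8000) and the "/ask" route are assumptions, not the
# documented interface of the wirawan/eli5 image.
import requests

resp = requests.post(
    "http://localhost:8000/ask",           # assumed host:port and route
    json={"question": "Why is the sky blue?"},
    timeout=120,                            # long-form generation can take a while
)
resp.raise_for_status()
print(resp.json())
```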

1 Like

@cahya amazing job! Thanks for sharing. :smiley:

BTW, did you further fine-tune the model or use it as is?
And which beam number did you use in the Telegram bot?

Thanks. I use the model as it is, without fine-tuning. I use a beam number of 16, with a max_len of 256 to reduce the text generation time. I also run faiss on the CPU, so the whole service can fit into my 8GB of GPU RAM. It actually needs only 5GB of GPU RAM now, instead of 13GB when I used the GPU for the faiss index.
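
In case it helps anyone: "faiss on the CPU" just means the dense passage index lives in host memory (faiss-cpu package), so only the retriever and generator models occupy GPU RAM. A tiny sketch with placeholder data — the real passage embeddings come from the retriever model, and the 128-dimension value is taken from the notebook:

```python
# Minimal sketch: keep the dense passage index in host RAM with faiss-cpu
# so GPU memory is used only by the models. Embeddings below are random
# placeholders standing in for the retriever's passage embeddings.
import faiss
import numpy as np

dim = 128                                                        # embedding size used in the notebook
passage_embeddings = np.random.rand(10_000, dim).astype("float32")  # placeholder corpus

index = faiss.IndexFlatIP(dim)   # exact inner-product index, stays on CPU
index.add(passage_embeddings)

query = np.random.rand(1, dim).astype("float32")  # placeholder query embedding
scores, ids = index.search(query, 10)             # retrieve top-10 passages
print(ids)
```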

2 Likes

Hi @Jung @cahya, just catching up now :slight_smile:

Thanks @cahya for the work on the docker image!

@Jung the demo is supposed to restart when it crashes, but that doesn’t always work, and then I have to restart it manually :wink: We’ll eventually change it so it uses our inference API to make it more stable.

I’d be super interested to know what you’re using it for in the meantime!

1 Like

Hi @yjernite, I am trying to use it with some books instead of the wiki. I will write here if it works :slight_smile: So far, I have used it in a Telegram chatbot for fun (https://github.com/cahya-wirawan/AskMeAnything-Chatbot).

1 Like