GPT-Neo 125M SQuAD model?

I’d like a GPT-Neo 125M model fine-tuned on SQuAD so I can ask questions of it.
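
To be concrete, here’s roughly what I’m hoping to be able to do locally. This is just an untested sketch: `EleutherAI/gpt-neo-125M` is the base checkpoint, and the SQuAD fine-tune that would make the answers good is exactly the part I’m missing.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Base checkpoint -- what I actually want is a SQuAD fine-tuned version of this
model_name = "EleutherAI/gpt-neo-125M"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

context = "The Amazon rainforest covers most of the Amazon basin of South America."
question = "What does the Amazon rainforest cover?"

# Prompt-style QA, since GPT-Neo is a decoder-only model with no QA head by default
prompt = f"Context: {context}\nQuestion: {question}\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, pad_token_id=tokenizer.eos_token_id)

# Print only the newly generated tokens after the prompt
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```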

I see a notebook (question_answering-tf) that I failed to retrofit, and another notebook that attempted it, but that one was a bit confusing with all its GPT-3 references.

I know NeoX has a version, and I know that if I view the GPT-Neo model or the Bloomz-mt model on Hugging Face, there’s an option to train it in AWS SageMaker. I got one model trained there, but I wasn’t able to download it. I’d much prefer to run it locally.
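
For the “run it locally” part, what I think I need is just to pull the weights down once and then load them from disk, nothing SageMaker-specific. A sketch of my assumption using the standard transformers save/load calls:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Download once from the Hub, then save to a local folder so it can run offline
model_name = "EleutherAI/gpt-neo-125M"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

tokenizer.save_pretrained("./gpt-neo-125M-local")
model.save_pretrained("./gpt-neo-125M-local")

# Later, load entirely from the local directory
tokenizer = AutoTokenizer.from_pretrained("./gpt-neo-125M-local")
model = AutoModelForCausalLM.from_pretrained("./gpt-neo-125M-local")
```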

I think I can train NeoX on Forefront, but I wouldn’t be able to run it in production unless I can run it locally, which is why I wanted to use the smaller models.

I know there is Happy Transformer, but I don’t see an example of using this model for Q/A.
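
The closest I can see in Happy Transformer is its text-generation class, so I assume Q/A with GPT-Neo there would have to be prompt-based. Something like this is my guess (untested, based on the HappyGeneration examples in their docs):

```python
from happytransformer import HappyGeneration, GENSettings

# Load the base GPT-Neo 125M checkpoint through Happy Transformer
happy_gen = HappyGeneration("GPT-NEO", "EleutherAI/gpt-neo-125M")

args = GENSettings(max_length=20)
result = happy_gen.generate_text(
    "Context: The Eiffel Tower is in Paris.\nQuestion: Where is the Eiffel Tower?\nAnswer:",
    args=args,
)
print(result.text)
```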

Same question for Bloomz-mt 300M

I see there are a run_qa script and a question_answering example notebook from Hugging Face, but I don’t know exactly what to swap out.
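
My guess is that the only parts that need swapping in the question_answering notebook are the checkpoint and dataset names, roughly like this. It’s untested, and I’m assuming my transformers version has a question-answering head registered for GPT-Neo and that its tokenizer needs a pad token set:

```python
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

# Swapped in place of the notebook's BERT-style checkpoint and the plain squad dataset
model_checkpoint = "EleutherAI/gpt-neo-125M"
raw_datasets = load_dataset("squad_v2")

tokenizer = AutoTokenizer.from_pretrained(model_checkpoint)
tokenizer.pad_token = tokenizer.eos_token  # GPT-Neo's tokenizer has no pad token by default

# Extractive QA head on top of GPT-Neo; the rest of the notebook
# (offset-mapping preprocessing, Trainer setup) would stay the same
model = AutoModelForQuestionAnswering.from_pretrained(model_checkpoint)
```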

I know I can pull in the squad_v2 dataset from datasets, which is nice, but I’m really looking for either a pre-built model or instructions to build one. I did follow this guide last night to create a sentiment analysis model, but I’m unsure how to adapt the instructions for QA.
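
For the “adapt the instructions for QA” part, my best guess is that the dataset prep would turn each squad_v2 row into a single prompt/answer string for causal-LM fine-tuning, something like this rough, unverified sketch:

```python
from datasets import load_dataset

squad = load_dataset("squad_v2")

def to_prompt(example):
    # squad_v2 includes unanswerable questions, where the answers list is empty
    answer = example["answers"]["text"][0] if example["answers"]["text"] else "unanswerable"
    text = (
        f"Context: {example['context']}\n"
        f"Question: {example['question']}\n"
        f"Answer: {answer}"
    )
    return {"text": text}

# One training string per example, ready for a causal-LM fine-tuning loop
train_texts = squad["train"].map(to_prompt, remove_columns=squad["train"].column_names)
print(train_texts[0]["text"])
```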

There is also this link I wanted to share.