Show us what you’ve created using transformers or nlp! It could be a blog post, a Jupyter notebook, a Colab, a picture, a GitHub repo, a web app, or anything else. Some tips:
Probably the easiest way to blog is FastPages, which can convert your notebooks and .md files directly into blog posts.
The easiest way to share a notebook on GitHub is to install the gist-it extension. This is only possible on platforms that support Jupyter extensions, such as GCP. Otherwise, you can create a notebook gist manually: click File -> Download to save the notebook to your computer, then follow the steps from this SO post.
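For the manual route, the steps above can also be done from a terminal with the official GitHub CLI. A minimal sketch, assuming `gh` is installed and already authenticated via `gh auth login`; the filename `my_notebook.ipynb` is a placeholder for whichever notebook you downloaded:

```shell
# Publish a downloaded notebook as a public gist.
# --public makes the gist visible to anyone with the link;
# --desc sets the gist description shown on github.
gh gist create my_notebook.ipynb --public --desc "My transformers experiment"
```

The command prints the URL of the new gist, which you can then paste into your post.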
To start this off, happy to announce it here first: I’ve been working on question generation using transformers for the past two months, and today I’m releasing the first set of experiments here.
Question generation is the task of automatically generating questions from a text passage. This project is an open-source study of question generation with pre-trained transformers (specifically seq2seq models) using straightforward end-to-end methods, without complicated pipelines. The goal is to provide simplified data-processing and training scripts, plus easy-to-use pipelines for inference.
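To make the end-to-end idea concrete, here is a minimal sketch of how answer-aware input preparation for a seq2seq QG model often looks. The `<hl>` highlight token and the `generate question:` task prefix follow one common convention; the exact format depends on the model you use, and `prepare_qg_input` is a hypothetical helper, not part of this project’s API:

```python
# Sketch: prepare answer-aware input for a seq2seq question-generation model.
# The answer span is wrapped in <hl> highlight tokens so the model knows
# which span in the passage to ask about; a task prefix tells a T5-style
# model which task to perform. Both conventions are assumptions here.

def prepare_qg_input(context: str, answer: str) -> str:
    """Highlight the answer span in the context and add a task prefix."""
    start = context.index(answer)  # raises ValueError if answer not present
    end = start + len(answer)
    highlighted = f"{context[:start]}<hl> {answer} <hl>{context[end:]}"
    return f"generate question: {highlighted}"

text = "Python was created by Guido van Rossum and released in 1991."
print(prepare_qg_input(text, "Guido van Rossum"))
# The model would then be asked to generate a question whose answer
# is the highlighted span, e.g. "Who created Python?"
```

The formatted string is what gets tokenized and fed to the seq2seq model; the simplicity of this step is what makes the end-to-end approach attractive compared to multi-stage pipelines.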
Do share your feedback, specifically regarding the quality of the questions, any mistakes, and any ethical biases you observe. Happy to discuss more details here. Cheers!
All models are available on the Hub with the inference API configured. You can find them by searching for the question-generation tag.
Here’s a Colab if anyone wants to play with it further.
@valhalla, this could be quite useful for domain-specific QA generation from an unstructured domain corpus, which is a big concern in my line of work. Thank you!