Share your work here ✅

Show us what you’ve created using :hugs: transformers and nlp! :slightly_smiling_face: It could be a blog post, a Jupyter notebook, a Colab, a picture, a GitHub repo, a web app, or anything else. Some tips:

  • Probably the easiest way to blog is FastPages, which can convert your notebooks and .md files directly into blog posts.
  • The easiest way to share a notebook on GitHub is to install the gist-it extension. This is only possible if you use a platform that supports Jupyter extensions, such as GCP. Otherwise, you can create a notebook gist manually: click File -> Download to save the notebook to your computer, then follow the steps from this SO post (or the scripted alternative sketched after this list):
    1. Go to https://gist.github.com/YOUR-GITHUB-USERNAME/
    2. Click ‘New Gist’ on the upper right corner
    3. Open the folder in a Finder/Explorer window on your local computer
    4. Drag the file into the text box (the ‘code space’). This should fill the space with JSON-looking text representing the notebook content.
    5. Copy/Paste the full file name (e.g., mynotebook.ipynb) into the filename box, and give a description above.
    6. Create the Gist!
  • If you want folks on the forum to look at a draft and give feedback without sharing it more widely, just mention that in your post.
  • You can also just reply to this topic to describe what you did, preferably pasting in a picture or two! :hugs:
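
If you prefer to script the gist creation instead of using the web UI, here is a minimal sketch that posts a downloaded notebook to the GitHub Gists REST API using requests. The token and file name below are placeholders you would replace with your own.

```python
import requests

# Placeholders: use your own personal access token (with the "gist" scope)
# and the notebook you saved via File -> Download.
GITHUB_TOKEN = "ghp_your_token_here"
NOTEBOOK_PATH = "mynotebook.ipynb"

with open(NOTEBOOK_PATH) as f:
    notebook_content = f.read()

# Create the gist via the GitHub REST API.
response = requests.post(
    "https://api.github.com/gists",
    headers={"Authorization": f"token {GITHUB_TOKEN}"},
    json={
        "description": "My course notebook",
        "public": True,
        "files": {NOTEBOOK_PATH: {"content": notebook_content}},
    },
)
response.raise_for_status()
print("Gist created at:", response.json()["html_url"])
```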
4 Likes

To start this off, I’m happy to announce it here first: I’ve been working on question generation using :hugs: transformers for the past two months, and today I’m releasing the first set of experiments.

Question generation is the task of automatically generating questions from a text paragraph. This project is an open-source study of question generation with pre-trained transformers (specifically seq2seq models) using straightforward end-to-end methods, without complicated pipelines. The goal is to provide simplified data processing and training scripts, along with easy-to-use pipelines for inference.

Specifically, I trained T5 models for the following (see the usage sketch after the list):

  1. answer-aware question generation
  2. multitask QA and QG
  3. end-to-end question generation (without answer supervision)
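
As a rough illustration, here is a minimal sketch of running end-to-end question generation with plain transformers. The checkpoint name and the "generate questions:" input prefix are assumptions based on the project’s conventions, so check the repo’s README for the exact model names and input formats.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Illustrative checkpoint name -- see the repo / model hub for the real ones.
model_name = "valhalla/t5-small-e2e-qg"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

context = (
    "Python is a programming language created by Guido van Rossum "
    "and first released in 1991."
)

# The "generate questions:" prefix is assumed here; the end-to-end model
# generates questions from the passage alone, without answer supervision.
inputs = tokenizer("generate questions: " + context, return_tensors="pt")
outputs = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```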

Here’s a sneak peek

Everything is built using :hugs: HuggingFace libraries:

  • Dataset: :hugs: nlp library
  • Model and training: :hugs: transformers
  • Models hosted on: :hugs: model hub

For more details, here’s the repo.

Do share your feedback, specifically regarding the quality of the questions, the mistakes, and any ethical biases you observe. Happy to discuss more details here. Cheers!

All models are available on the :hugs: model hub with the inference API configured. You can find them by searching for the question-generation tag.
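
For example, here is a minimal sketch of querying one of the hosted checkpoints through the Inference API with requests; the API token and model id are placeholders, and the exact response format depends on the pipeline the model is configured with.

```python
import requests

# Placeholders: your own API token and a checkpoint found via the
# question-generation tag on the model hub.
API_TOKEN = "hf_your_token_here"
MODEL_ID = "valhalla/t5-small-e2e-qg"

response = requests.post(
    f"https://api-inference.huggingface.co/models/{MODEL_ID}",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"inputs": "generate questions: 42 is the answer to life, the universe and everything."},
)
print(response.json())
```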

Here’s a Colab if anyone wants to play with it some more.

11 Likes

@valhalla, this could be quite useful for domain-specific QA generation from an unstructured domain corpus, which is a big concern in my line of work. Thank you!

1 Like

This is amazing!

1 Like