Idea: Train a GPT-2 model for contextual commonsense reasoning using the COSMOS QA dataset. The goal is to test how well GPT-2 handles commonsense reasoning.
Model: We need to add support for a `FlaxGPT2ForMultipleChoice` model, which can be built on top of the existing `FlaxGPT2Model`; a sketch of how this could look is included below.
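As a rough illustration of the plan, here is a minimal sketch of a multiple-choice head over the GPT-2 backbone, mirroring the pattern of `FlaxBertForMultipleChoice`. The module name, the use of the internal `FlaxGPT2Module`, and the last-token pooling are our assumptions, not a final transformers API:

```python
import jax.numpy as jnp
import flax.linen as nn
from transformers import GPT2Config
from transformers.models.gpt2.modeling_flax_gpt2 import FlaxGPT2Module

class FlaxGPT2ForMultipleChoiceModule(nn.Module):
    # Hypothetical module; follows the FlaxBertForMultipleChoice pattern.
    config: GPT2Config
    dtype: jnp.dtype = jnp.float32

    def setup(self):
        self.transformer = FlaxGPT2Module(config=self.config, dtype=self.dtype)
        self.classifier = nn.Dense(1, dtype=self.dtype)

    def __call__(self, input_ids, attention_mask, deterministic: bool = True):
        # Inputs arrive as (batch, num_choices, seq_len); flatten the choice axis
        # so every candidate answer is encoded as its own sequence.
        num_choices = input_ids.shape[1]
        flat_ids = input_ids.reshape(-1, input_ids.shape[-1])
        flat_mask = attention_mask.reshape(-1, attention_mask.shape[-1])
        position_ids = jnp.broadcast_to(
            jnp.arange(flat_ids.shape[-1])[None, :], flat_ids.shape
        )

        hidden_states = self.transformer(
            flat_ids, flat_mask, position_ids, deterministic=deterministic
        )[0]  # (batch * num_choices, seq_len, hidden_size)

        # GPT-2 has no [CLS] token, so pool each sequence at its last non-padded token.
        last_token = flat_mask.sum(axis=-1).astype(jnp.int32) - 1
        pooled = hidden_states[jnp.arange(hidden_states.shape[0]), last_token]

        # One logit per candidate answer, reshaped back to (batch, num_choices).
        return self.classifier(pooled).reshape(-1, num_choices)
```

A full `FlaxGPT2ForMultipleChoice` would additionally wrap this module in a `FlaxPreTrainedModel` subclass so that `from_pretrained` weight loading works, the way the other Flax heads in transformers do.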
We would appreciate some guidance on the project and would love to hear feedback (@patrickvonplaten and @valhalla, please have a look). We are very excited to contribute this task to Hugging Face.
We are currently a team of two:
1. Rohan V. Kashyap (@Rohan)
2. Vivek V. Kashyap (@Vivek)
Thanks @patrickvonplaten, we are very much looking forward to this. We are confident we can get this done, and we hope more people will join the project and contribute to it.
@patrickvonplaten, it would be great if you could add our project to the Google sheet. We would like to make this a part of Hugging Face and are excited to get started.
Hi @patrickvonplaten, we are working on it. We have finished writing our training script for BERT multiple choice in Flax, along with the preprocessing for the COSMOS QA dataset, and it is working fine. We just need TPU access and to adapt the same training setup to the GPT-2 multiple-choice model; that should be completed by tomorrow. We have also created the Discord channel.
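For anyone who wants to follow along, here is a rough sketch of how COSMOS QA can be prepared for a multiple-choice model. It assumes the `cosmos_qa` dataset as hosted on the Hugging Face hub (fields `context`, `question`, `answer0`–`answer3`, `label`); the tokenizer, max length, and padding strategy are illustrative choices, not the final training configuration:

```python
from datasets import load_dataset
from transformers import GPT2TokenizerFast

raw = load_dataset("cosmos_qa")
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token

def preprocess(example):
    # Build one sequence per answer candidate: context + question + answer.
    candidates = [
        example["context"] + " " + example["question"] + " " + example[f"answer{i}"]
        for i in range(4)
    ]
    enc = tokenizer(candidates, truncation=True, max_length=256, padding="max_length")
    # Each feature becomes a (num_choices, seq_len) list, matching the
    # (batch, num_choices, seq_len) inputs a multiple-choice head expects.
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],
        "label": example["label"],
    }

columns = raw["train"].column_names
processed = raw["train"].map(preprocess, remove_columns=columns)
```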