Thanks to NielsRogge for pointing me to the Inference_with_GPT_J_6B.ipynb notebook in his Transformers-Tutorials repository on GitHub (NielsRogge/Transformers-Tutorials). I made some enhancements to it and uploaded the code to GitHub at rajib76/gpt-j-example, a repository that shows how to add examples to GPT-J-6B similar to GPT-3. The idea was to be able to “add examples” the way we can in GPT-3 and then get the best response from the model. I know this can be improved a lot, but this is a first version, and I am hoping for feedback and suggestions to improve it further.
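For readers unfamiliar with the idea, “adding examples” here means few-shot prompting: prepending a handful of input/output pairs to the prompt so the model continues the pattern. The sketch below is a minimal illustration of that technique, not the exact prompt format used in the linked repository; the `Q:`/`A:` labels and the example pairs are my own assumptions. The GPT-J-6B generation call is left commented out because the model needs substantial memory to load.

```python
# Minimal sketch of few-shot prompting ("adding examples") for GPT-J-6B.
# The prompt layout and example pairs below are illustrative assumptions,
# not the exact format used in the rajib76/gpt-j-example repository.

def build_few_shot_prompt(examples, query, input_label="Q", output_label="A"):
    """Concatenate example input/output pairs into one prompt, then append
    the new query so the model completes the final answer."""
    lines = []
    for inp, out in examples:
        lines.append(f"{input_label}: {inp}")
        lines.append(f"{output_label}: {out}")
    # Leave the last answer blank for the model to fill in.
    lines.append(f"{input_label}: {query}")
    lines.append(f"{output_label}:")
    return "\n".join(lines)

examples = [
    ("What is the capital of France?", "Paris"),
    ("What is the capital of Japan?", "Tokyo"),
]
prompt = build_few_shot_prompt(examples, "What is the capital of Italy?")
print(prompt)

# To run the prompt through GPT-J-6B with Hugging Face Transformers
# (loading the full model requires a large amount of RAM/VRAM):
# from transformers import AutoTokenizer, AutoModelForCausalLM
# tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
# model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")
# inputs = tokenizer(prompt, return_tensors="pt")
# output_ids = model.generate(**inputs, max_new_tokens=10)
# print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The model simply continues the text after the trailing `A:`, so the quality of the completion depends heavily on how consistent the example formatting is.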