How to do few-shot in-context learning using GPT-Neo

Hello,

I want to use the EleutherAI/gpt-neo-1.3B model from the Hugging Face Hub
to do few-shot learning.

I wrote my customized prompt, denoted as my_customized_prompt, like this:

label:science technology
content:Google Making ‘Project Tango’ Tablets With Advanced 3D Vision: Report
###
label:science technology
content:Uber Confirms “Record Breaking” Fundraising, Interest In Driverless Ubers
###
label:health
content:‘Biggest investigation’ launched into effects of mobiles
###
label:business
content:Google Earnings Preview: What Wall Street’s Saying
###
label:health
content:OECD sees growth slowing as emerging markets lose steam
###
label:entertainment
content:Zac Efron says he’s up for more ‘High School Musical,’ and so are we
###
label:science technology
content:An angry letter to eBay: 5 questions it must answer about its security breach
###
label:entertainment
content:Kate Mulgrew doesn’t think the sun orbits the Earth
###
label:science technology
content:Lunar eclipse coming
###
label:entertainment
content:UK police probing Peaches Geldof’s sudden death
###
label:health
content:CDC: Maine tops nation in rate of long-term opiate prescriptions
###
label:entertainment
content:‘Animal House’ times 10
###
label:business
content:Google, Facebook best employers in US: Survey
###
label:business
content:No Change in Mortgage Rates at TD Bank Today - Tuesday April 15
###
label:health
content:Breakthrough hepatitis C treatment cures over 90 percent of patients with cirrhosis
###
label:politics
content:

and I want GPT-Neo to generate new content following the final "label:politics" line.
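(For reference, here is a small sketch of how I assemble a prompt like the one above. The example pairs and the variable name are just placeholders; each label/content block is joined with the "###" separator and the prompt ends with the label to be continued.)

```python
# Placeholder (label, headline) pairs standing in for the real examples above.
examples = [
    ("science technology", "Lunar eclipse coming"),
    ("health", "'Biggest investigation' launched into effects of mobiles"),
    ("business", "Google Earnings Preview: What Wall Street's Saying"),
]

# One "label:...\ncontent:..." block per example, separated by "###",
# ending with the target label and an empty content field to complete.
blocks = [f"label:{label}\ncontent:{content}" for label, content in examples]
my_customized_prompt = "\n###\n".join(blocks) + "\n###\nlabel:politics\ncontent:"
print(my_customized_prompt)
```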

So how should I feed this prompt to GPT-Neo?
Should I simply pass it as a regular prompt, the same as with GPT-2,
like this?

from transformers import pipeline
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")
generator(my_customized_prompt, do_sample=True, min_length=50)
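(One thing I expect to need either way: the model will likely keep emitting further label/content pairs after the one I want, so the output has to be cut at the first "###" separator. A small helper sketch — extract_completion is my own name, not a transformers API, and the generated string below is simulated, not real model output:)

```python
def extract_completion(generated_text: str, prompt: str) -> str:
    """Drop the echoed prompt, then keep only the text before the first '###'."""
    continuation = generated_text[len(prompt):]
    return continuation.split("###")[0].strip()

# Simulated pipeline output: the prompt echoed back plus a continuation.
prompt = "label:politics\ncontent:"
generated = prompt + "Senate passes new budget bill\n###\nlabel:health\ncontent:..."
print(extract_completion(generated, prompt))
```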

Thanks.
