I am trying to create a new, non-pretrained XLMProphetNetForConditionalGeneration with a custom number of layers. However, the forward pass fails. The code to reproduce the problem is below.
```python
from transformers import (
    XLMProphetNetTokenizer,
    XLMProphetNetConfig,
    XLMProphetNetForConditionalGeneration,
)

tokenizer = XLMProphetNetTokenizer.from_pretrained('microsoft/xprophetnet-large-wiki100-cased')
input_ids = tokenizer("Studies have been shown that owning a dog is good for you", return_tensors="pt").input_ids
decoder_input_ids = tokenizer("Studies show that", return_tensors="pt").input_ids

model = XLMProphetNetForConditionalGeneration(XLMProphetNetConfig())
outputs = model(input_ids=input_ids, decoder_input_ids=decoder_input_ids, return_dict=True)
```
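For context, this is roughly how I intend to set the custom number of layers (the snippet only builds the config, not the full model; I'm assuming `num_encoder_layers` and `num_decoder_layers` are the parameters that control depth):

```python
from transformers import XLMProphetNetConfig

# Hypothetical smaller setup: 3 encoder and 3 decoder layers instead of the defaults.
config = XLMProphetNetConfig(num_encoder_layers=3, num_decoder_layers=3)
print(config.num_encoder_layers, config.num_decoder_layers)
```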
Can someone tell me what I'm doing wrong here? Can I not initialize a model like this? It works for other models like GPT-2.