BART outputting </s> at the start of every decoded sentence

For any sentence I input into the pretrained BartForConditionalGeneration model, it outputs a sentence with the prefix </s>. Am I doing something wrong?
For example, if I input "Obama is the best president", it gives back </s><s>Obama is the greatest president ever</s>. The same is the case with every input sentence.
TIA

This is expected behavior. If you look at the BartForConditionalGeneration docs, the decoder_start_token_id is the EOS token by default (which is </s>). This is in contrast to other models; a BERT decoder, for instance, is given the CLS token as its starting token.
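If you just want clean output text, you can strip the special tokens at decoding time. Here is a minimal sketch, assuming the transformers library and the facebook/bart-large checkpoint (swap in whichever checkpoint you are actually using):

```python
from transformers import BartForConditionalGeneration, BartTokenizer

# Assumes the facebook/bart-large checkpoint; other BART checkpoints behave the same way.
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")
tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")

# The decoder is started with the EOS token (</s>), not CLS as in BERT-style decoders.
print(model.config.decoder_start_token_id)  # 2
print(tokenizer.eos_token_id)               # 2, i.e. </s>

inputs = tokenizer("Obama is the best president", return_tensors="pt")
output_ids = model.generate(**inputs, max_length=30)

# Raw decoding keeps the special tokens, so you see the </s><s> ... </s> wrapping.
print(tokenizer.decode(output_ids[0]))

# skip_special_tokens=True removes them and leaves only the generated text.
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

So the </s><s> prefix in the raw decode is not a bug: generate prepends decoder_start_token_id automatically, and passing skip_special_tokens=True to decode gives you just the sentence.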