Effect of punctuation on Transformer models

How are BERT-like or sentence-transformer-like models affected by punctuation marks such as ‘.’ and newline (‘\n’) characters?

Will there be any impact on sentence/word embeddings if we strip punctuation versus keeping it? Does punctuation also get embedded as separate tokens?
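
For context, here is a minimal sketch of the comparison I have in mind (assuming the sentence-transformers library and the `all-MiniLM-L6-v2` checkpoint purely as an example model):

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

with_punct = "The quick brown fox jumps over the lazy dog.\nIt was tired."
without_punct = "The quick brown fox jumps over the lazy dog It was tired"

# 1) Does punctuation become its own token? Inspect the tokenizer output.
tokens = model.tokenizer.tokenize(with_punct)
print(tokens)  # '.' shows up as a separate token; '\n' is treated like whitespace

# 2) How different are the sentence embeddings with vs. without punctuation?
emb = model.encode([with_punct, without_punct], convert_to_tensor=True)
print(util.cos_sim(emb[0], emb[1]).item())  # cosine similarity of the two embeddings
```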