Longformer Token Length

When embedding documents with more than 4096 tokens, does Longformer automatically create chunks of 4096 tokens per document (until all words are represented as tokens), where each chunk is returned as its own tensor?
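To make the question concrete, here is a minimal sketch of the chunking behavior being asked about: splitting a long sequence of token IDs into consecutive windows of at most 4096 tokens. This is plain Python, not Longformer internals — whether the model/tokenizer actually does this automatically is exactly the question; the function name and constant are illustrative.

```python
MAX_LEN = 4096  # Longformer's default maximum sequence length

def chunk_token_ids(token_ids, max_len=MAX_LEN):
    """Split a flat list of token IDs into consecutive chunks of at most max_len."""
    return [token_ids[i:i + max_len] for i in range(0, len(token_ids), max_len)]

# Example: a 10,000-token document yields chunks of 4096, 4096, and 1808 tokens,
# each of which could then be converted to its own tensor.
ids = list(range(10000))
chunks = chunk_token_ids(ids)
print([len(c) for c in chunks])  # → [4096, 4096, 1808]
```

If this is what happens, each chunk would presumably be padded/batched and embedded separately; if not, the tokenizer may simply truncate at 4096 and the chunking would have to be done manually as above.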