Why is sep_token_id the same as eos_token_id for allenai/led-base-16384?

I am wondering why sep_token_id is the same as eos_token_id for the allenai/led-base-16384 Hugging Face model. This creates confusion during inference: does the </s> token refer to eos_token or to sep_token? Ideally the two would be used in different places during text generation.

Steps to reproduce the issue:

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("allenai/led-base-16384")
model = AutoModelForSeq2SeqLM.from_pretrained("allenai/led-base-16384")
print(tokenizer.eos_token_id)  # prints 2
print(tokenizer.sep_token_id)  # also prints 2 -- same id as eos_token_id