Numeric embedding input for transformers

Hi everyone, I want to train a self-supervised BERT-style model using masked language modelling, but my task involves sequential data with a mix of numerical and categorical features. I know that categorical features are encoded with Embedding layers, but is there any library implementation for projecting numerical features into the same latent space, so they can be fed in as transformer input alongside the categorical embeddings?
Maybe something similar to this paper: "On Embeddings for Numerical Features in Tabular Deep Learning", https://arxiv.org/abs/2203.05556
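For context on what such an embedding looks like: the simplest scheme from that line of work is a per-feature learned linear map, where each scalar x_i becomes the vector x_i * w_i + b_i. Below is a minimal PyTorch sketch of that idea (the class name `NumericEmbedding` and all hyperparameters are my own illustration, not an existing library API; the paper's authors also ship a reference implementation in their `rtdl` package, which is worth checking before rolling your own):

```python
import torch
import torch.nn as nn

class NumericEmbedding(nn.Module):
    """Per-feature linear embedding: scalar x_i -> x_i * w_i + b_i.

    Maps a batch of numerical features (batch, n_features) to token
    vectors (batch, n_features, d_embed) that can be concatenated with
    categorical Embedding outputs along the sequence dimension.
    """
    def __init__(self, n_features: int, d_embed: int):
        super().__init__()
        # One learned weight/bias vector per numerical feature.
        self.weight = nn.Parameter(torch.empty(n_features, d_embed))
        self.bias = nn.Parameter(torch.empty(n_features, d_embed))
        nn.init.normal_(self.weight, std=0.02)
        nn.init.zeros_(self.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_features) -> (batch, n_features, d_embed)
        return x.unsqueeze(-1) * self.weight + self.bias

emb = NumericEmbedding(n_features=4, d_embed=16)
tokens = emb(torch.randn(8, 4))
print(tokens.shape)  # torch.Size([8, 4, 16])
```

The paper goes further (periodic/Fourier embeddings, piecewise-linear encodings), but they all share this shape contract: one embedding vector per numerical feature, so each feature acts as a "token" in the transformer's input sequence.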