How to use a Hugging Face model directly on continuous values?

Hi, I have a dataset of continuous values with shape [batch_size, features].

The features look like this:

[0.49221584, -0.021571456, -0.0920076, -0.14408934, -0.62306774]

I want to apply a transformer model to these values and pass the result to a final layer, something like this:

batch_data ==> Transformer ==> output_layer ==> classification

Currently, I am passing these values through a hand-coded transformer block (multi-head attention and layer norm with a feed-forward network).
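For context, a minimal Keras sketch of that kind of hand-coded block might look like the following (D_MODEL, NUM_CLASSES, and the "each scalar feature is one token" convention are placeholders/assumptions, not my exact code):

```python
import tensorflow as tf

NUM_FEATURES = 5   # length of each continuous feature vector (as in the example above)
D_MODEL = 64       # internal width of the transformer block (placeholder)
NUM_CLASSES = 2    # number of target classes (placeholder)

inputs = tf.keras.Input(shape=(NUM_FEATURES,), dtype=tf.float32)

# Treat each scalar feature as one "token":
# (batch, features) -> (batch, features, 1) -> (batch, features, d_model)
x = tf.keras.layers.Dense(D_MODEL)(tf.expand_dims(inputs, axis=-1))

# One encoder block: multi-head attention + residual/norm, then feed-forward + residual/norm
attn = tf.keras.layers.MultiHeadAttention(num_heads=4, key_dim=D_MODEL // 4)(x, x)
x = tf.keras.layers.LayerNormalization()(x + attn)
ffn = tf.keras.layers.Dense(4 * D_MODEL, activation="relu")(x)
ffn = tf.keras.layers.Dense(D_MODEL)(ffn)
x = tf.keras.layers.LayerNormalization()(x + ffn)

# Pool over the token axis and classify
pooled = tf.keras.layers.GlobalAveragePooling1D()(x)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(pooled)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```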

I went through the Hugging Face models, but they all accept tokens and sequences. Is there any way/hack to use Hugging Face transformer models directly on continuous values?

A quick TensorFlow/Keras template for getting started with continuous values would be helpful.

I think @patrickvonplaten’s suggestion in https://github.com/huggingface/transformers/issues/6608 to use inputs_embeds would be the way to go here.
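For example, a rough (untested) sketch: the bert-base checkpoint, the 768 hidden size, and the feature-as-token projection are assumptions. The idea is to project each continuous feature into the model's embedding space and skip the tokenizer entirely by passing inputs_embeds:

```python
import tensorflow as tf
from transformers import TFBertModel

class ContinuousBertClassifier(tf.keras.Model):
    def __init__(self, num_classes=2, hidden_size=768):
        super().__init__()
        # Projects each scalar feature into the encoder's embedding space
        self.project = tf.keras.layers.Dense(hidden_size)
        self.bert = TFBertModel.from_pretrained("bert-base-uncased")
        self.classifier = tf.keras.layers.Dense(num_classes, activation="softmax")

    def call(self, features, training=False):
        # (batch, num_features) -> (batch, num_features, 1): each scalar becomes one "token"
        x = tf.expand_dims(features, axis=-1)
        # (batch, num_features, hidden_size)
        embeds = self.project(x)
        # Bypass input_ids / the tokenizer by feeding embeddings directly
        outputs = self.bert(inputs_embeds=embeds, training=training)
        # Pool the first position, then classify
        pooled = outputs.last_hidden_state[:, 0, :]
        return self.classifier(pooled)

model = ContinuousBertClassifier()
logits = model(tf.constant([[0.49221584, -0.021571456, -0.0920076, -0.14408934, -0.62306774]]))
```

Note that the projection must match the checkpoint's hidden size (768 for bert-base), and since the pretrained weights were learned on token embeddings you may want to experiment with freezing or fine-tuning the encoder.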

Or you could also try simply bucketizing your continuous values and embedding the bucket ids, like tokens.
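Something like this sketch, where the bucket boundaries and embedding size are placeholders (in practice you'd derive the boundaries from your data, e.g. quantiles of the training set):

```python
import tensorflow as tf

# Placeholder bucket boundaries; derive real ones from your training data.
bin_boundaries = [-0.5, -0.25, 0.0, 0.25, 0.5]
num_buckets = len(bin_boundaries) + 1

features = tf.keras.Input(shape=(5,), dtype=tf.float32)

# Map each continuous value to an integer bucket id, turning the vector into a "token" sequence.
token_ids = tf.keras.layers.Discretization(bin_boundaries=bin_boundaries)(features)

# Embed the bucket ids exactly as you would embed word tokens.
embeddings = tf.keras.layers.Embedding(input_dim=num_buckets, output_dim=64)(token_ids)
# `embeddings` has shape (batch, 5, 64) and can feed any transformer block,
# or be projected to the hidden size and passed as inputs_embeds as above.
```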


@julien-c Any template would be a great help.