Find dependencies between words in a sentence

I need to find syntactic connections between words in a sentence, like the dependency trees the spaCy library produces. How can I achieve these results with deep learning? I don't really understand how Hugging Face transformers work, because that library relies on a "self-attention" mechanism, which is still a mystery to me. Maybe I should stick to an RNN, but I don't know what kind of properties (words, lemmas, morphemes) I should feed to the network, or how to vectorize them.
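For clarity, here is a toy sketch (plain Python, no spaCy needed, the sentence and indices are just an example) of the kind of head/parent structure I mean:

```python
# Each token has a syntactic head ("parent"); -1 marks the sentence root here.
sentence = ["The", "cat", "sat", "on", "the", "mat"]
heads = [1, 2, -1, 2, 5, 3]  # heads[i] = index of token i's parent

pairs = []
for i, h in enumerate(heads):
    parent = "ROOT" if h == -1 else sentence[h]
    pairs.append((sentence[i], parent))
    print(f"{sentence[i]} -> {parent}")
```

So "The" points to "cat", "cat" points to "sat", and "sat" is the root. This is the output I want a model to predict.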

I created a small dataset sample where I store each word, its POS, tense, gender, case, and number (0 if the word doesn't have that property), plus the word's parent (0 if it's the sentence root).
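To make that concrete, one sentence in my dataset looks roughly like this (the exact words, tag values, and column order are just illustrative):

```python
# Columns: word, POS, tense, gender, case, number, parent index (1-based).
# 0 means the word lacks that property; parent 0 means sentence root.
rows = [
    ("She",   "PRON", 0,      "fem", "nom", "sing", 2),
    ("reads", "VERB", "pres", 0,     0,     "sing", 0),  # 0 -> root
    ("books", "NOUN", 0,      0,     "acc", "plur", 2),
]

# Sanity check: exactly one root per sentence.
roots = [r[0] for r in rows if r[-1] == 0]
print(roots)
```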

I have a few questions:

  1. What would be an appropriate dataset size (in sentences) for this problem?
  2. What kind of model do I need to solve this, and how would such a model learn?

I can't figure this out on my own, so please describe everything in as much detail as possible. Thank you!