I am a complete beginner with Transformers and Hugging Face, and I searched the forum but did not find anything relevant. If I understood correctly, transfer learning should allow us to adapt a pretrained model to new downstream tasks.
To use a more specific example, say I load bert-base-uncased. I see that the model can be trained on e.g. text classification, question answering, etc. If I wanted to run an unlisted task, say NER, can I somehow train the same model, or should I turn to a different model that already supports this task?
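To make the question concrete, here is roughly what I imagine (a minimal sketch assuming the `transformers` library; the `num_labels=9` value is just an illustration, borrowed from the CoNLL-2003 NER tag scheme):

```python
# Sketch: reusing the pretrained bert-base-uncased encoder for token
# classification (the task family NER belongs to). The classification
# head on top is newly initialized and would then be fine-tuned on NER data.
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=9,  # e.g. the 9 tags of the CoNLL-2003 NER scheme
)
```

Is this the right way to think about it, i.e. the same checkpoint with a different task head?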
If the texts are very field-specific, like science or medicine, can bert-base-uncased still be used, or should I turn to more domain-specific models?
These questions might be a bit elementary, but please share any information that could guide me.
PS: I have the impression that this post is asking a similar question.