Hello all
I have a project where I would like to train a chatbot on one author's books. It's a bot that will only answer questions based on the author's books, so there's no need for it to know how many people live in each country, what they eat for breakfast, and so many other things.
How big a model is needed, and does the size have any effect on how well it can communicate with the user/asker?
There might also be a problem with the language, though I'm not sure, since the bot is specific to only these books. The author is Danish, a sort of philosopher, and from searching around I know that it can be a problem in general to use English-trained models in another language, because of differences in how people in different cultures speak and use their words.
Then of course there's the hardware. I have an i5-13600K, 64 GB of RAM and an RTX 3060 with 12 GB of VRAM. From what I have read, this will limit me to maybe a 13B model.
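If I've understood the numbers right (and please correct me if not), a 13B model at 4-bit quantization is roughly 13 × 0.5 ≈ 6.5 GB of weights plus a few GB for the context cache, so it should just fit in 12 GB of VRAM, while 8-bit (around 13 GB) or full 16-bit precision (around 26 GB) would not.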
Then there's the training itself, where I was advised to go with RAG instead, but after reading back and forth and watching YouTube video after YouTube video, I'm still not sure. My rough understanding of what RAG would look like is sketched below.
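Just so you can tell me if I've misunderstood the idea: this is a rough sketch of how I imagine a RAG pipeline would work, assuming a sentence-transformers embedding model and a local LLM on top. The model names and passages here are just placeholders, not a recommendation.

```python
# Rough sketch of my understanding of RAG: embed the book text, find the
# passages closest to the question, and have the model answer only from
# those passages. Model names below are placeholders/assumptions.
from sentence_transformers import SentenceTransformer, util

# 1. Split the books into passages (here just a toy list).
passages = [
    "Passage one from the author's book...",
    "Passage two from the author's book...",
]

# A multilingual embedding model would probably be needed for Danish text.
embedder = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")
passage_vecs = embedder.encode(passages, convert_to_tensor=True)

def retrieve(question, k=3):
    """Return the k passages most similar to the question."""
    q_vec = embedder.encode(question, convert_to_tensor=True)
    hits = util.semantic_search(q_vec, passage_vecs, top_k=k)[0]
    return [passages[h["corpus_id"]] for h in hits]

# 2. Build a prompt that restricts the answer to the retrieved passages.
question = "What does the author say about freedom?"
context = "\n".join(retrieve(question))
prompt = (
    "Answer only from the passages below.\n\n"
    f"{context}\n\nQuestion: {question}\nAnswer:"
)

# 3. This prompt would then be sent to a local LLM (llama.cpp, Ollama, etc.).
print(prompt)
```

Is that roughly it, or am I missing something about why fine-tuning would still be needed?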
The result I want is a chatbot that ends up being a super-search know-it-all regarding everything the author has written.
All the best
Carsten, Denmark