Wake-up thoughts: we have LLMs trained on tons of written knowledge, so they can say things about Napoleon and heart surgery. But if I imagine "something" able to formulate answers about specific content (for example, my own products), it doesn't need to know about Egyptian history.
Are there LLMs trained only to "speak a language", which can then be trained on a specific small/niche topic (or maybe just enhanced with RAG)?
Maybe it's a silly question…
There are many LLMs of different sizes and capabilities; just search for LLM models on Google. The most important thing is that an LLM trained on a large dataset can pick up a lot about the language itself, idioms, etc., i.e. generic knowledge that is not easy to learn from a small dataset, because the "knowledge" being fed in is small.
Such a well-trained LLM can then be fine-tuned for a specific task, though there are also approaches where smaller LLMs are trained exclusively on a specialized dataset.
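To make the RAG idea mentioned above concrete, here is a minimal sketch of how retrieval can feed niche content (e.g. your product docs) into a general-purpose LLM's prompt. All names and the naive word-overlap scoring are illustrative assumptions, not any particular library's API:

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# The document store, scoring, and prompt format are
# hypothetical; real systems use embedding-based retrieval.

def retrieve(query: str, documents: list[str], top_k: int = 1) -> list[str]:
    """Rank documents by naive word overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Prepend retrieved context so a general LLM can answer
    niche questions it was never pretrained on."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Hypothetical product documentation.
product_docs = [
    "The X100 widget supports 24V input and ships with a 2-year warranty.",
    "Our return policy allows refunds within 30 days of purchase.",
]

prompt = build_prompt("What warranty does the X100 come with?", product_docs)
print(prompt)
```

The point is that the model's generic language ability does the answering, while the retrieval step supplies the niche knowledge at query time, so no Egyptian-history-scale pretraining of your own is required.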