Yawn. This is probably the hundredth time I’ve seen this scenario trotted out, and knowledge-base retrieval and interpretation was solved before Bing Chat was even out of limited sign-up.
You don’t even need to fine-tune a model to do this: you just give it a search API over your documentation, code, and internal messaging history. It pulls up relevant information based on queries it generates from your prompt, then compiles it into a nicely written explanation with hyperlinked sources.
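The workflow being described is the usual retrieve-then-generate loop. A minimal sketch, where `search_docs` is a hypothetical stand-in for a real search API and a template stands in for the model's final write-up:

```python
# Sketch of retrieval-augmented answering: query generation -> search -> compile.
# search_docs() is a hypothetical placeholder for a real internal search API.

def search_docs(query):
    # Stand-in for a search API over docs, code, and chat history.
    corpus = {
        "deploy": "Deploys run via CI on merge to main (see deploy docs).",
        "auth": "Auth tokens are issued by the internal SSO gateway.",
    }
    return [text for key, text in corpus.items() if key in query.lower()]

def answer(prompt):
    # 1. The model would generate search queries from the prompt;
    #    faked here by picking keywords out of the prompt.
    queries = [w for w in prompt.lower().split() if len(w) > 3]
    # 2. Pull relevant snippets for each query.
    snippets = [s for q in queries for s in search_docs(q)]
    # 3. The model would compile snippets into prose with sources;
    #    a template stands in for that step.
    return "Based on internal docs:\n" + "\n".join(f"- {s}" for s in snippets)

print(answer("How does deploy work?"))
```

The point of the sketch is that no weights are updated anywhere: the model only ever sees retrieved text at answer time, so stale documents can be re-indexed without retraining anything.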
I'm not sure I follow. Knowledge-base retrieval of what, exactly? Outdated docs and dilapidated code? Aging internal wikis and chat histories erased for legal reasons?
Everyone also seems to overlook how much time and compute it takes to train or fine-tune these models on a corpus of knowledge. Researchers have estimated that it would have taken OpenAI the equivalent of 355 years on a single NVIDIA V100 to train GPT-3. [1] Clearly they used far more hardware in parallel, and getting that hardware is its own problem right now for other reasons. [2]
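The 355-year figure is easy to sanity-check with back-of-envelope arithmetic. The inputs below (total training compute of roughly 3.14e23 FLOPs and sustained V100 throughput of roughly 28 TFLOPS) are the commonly cited estimates behind that number, not official OpenAI figures, so treat them as assumptions:

```python
# Back-of-envelope check of the "355 V100-years" estimate for GPT-3.
# Both constants are assumed values from published third-party estimates.
TOTAL_FLOPS = 3.14e23            # estimated total training compute for GPT-3
V100_FLOPS_PER_SEC = 28e12       # assumed sustained V100 throughput (FLOP/s)
SECONDS_PER_YEAR = 365.25 * 24 * 3600

years = TOTAL_FLOPS / V100_FLOPS_PER_SEC / SECONDS_PER_YEAR
print(f"~{years:.0f} V100-years")   # roughly 355
```

Which is exactly why parallelism (and the hardware supply behind it) is the real constraint, not wall-clock patience.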