
Knowledge
What does an LLM know about you compared to a Search Engine?
I think about how much more an LLM knows about me compared to a traditional¹ search engine. I give the LLM far more information than a search engine because I use it much more interactively. A search engine does not return results based on a previous query; the queries are always isolated and don’t build on each other². That’s different with LLMs. I could ask for a list of restaurants in my area and then follow up with filters on that list, e.g. asking for restaurants (from the initial list) that are open this evening and offer vegan dishes.

With a search engine, I would start with the same first query, asking for a list of restaurant recommendations, but then I would go through the recommendations one by one myself. I definitely could also ask the search engine for “a list of vegan restaurants” in the first place, but as soon as I asked for “a list of vegan restaurants open tonight”, some - if not all - search engines wouldn’t be able to give me valid results. Thus, the LLM learns more about me than the search engine does (in this example, it would know that I might eat at a restaurant tonight).

This is just a simple example, but I think interacting with an LLM lets it learn much more from us than we realize. You can probably compare it to humans interacting with each other: if you have a conversation and react to the things the other person is saying, you learn more than by just telling or listening without any reaction.