Or is Kagi aspiring to be a source of information itself, to answer questions directly? (e.g. "Who was president of the U.S. when superconductivity was discovered?")
We do aspire to get the information the user has asked for as fast as possible. Having a special UX widget for that is nothing unusual. When you ask for 'palo alto weather' you get a weather widget on top of the search results. When you search for 'intc stock' you get a stock widget. When you ask for the time in London you get a clock.
I see an LLM answer as just a different type of search widget that serves the same purpose: it augments search results with a special UX that offers information faster and with better information density.
That is exactly what is going on in this example, and why the user loved it. Similar to the weather or time widget, this is basically an LLM widget.
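To make the "just another widget" point concrete, here is a rough sketch of what I mean. All the names (detect handlers, `LLMAnswerWidget`, etc.) are made up for illustration; this is not how our pipeline is actually implemented, just the shape of the idea:

```python
# Illustrative only: a toy dispatcher where an LLM answer is just one more
# widget type next to weather, stock and clock widgets. Hypothetical names,
# not Kagi's real code.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Widget:
    kind: str      # "weather", "stock", "llm_answer", ...
    content: str   # rendered on top of the organic search results


def weather_widget(query: str) -> Optional[Widget]:
    if "weather" in query:
        return Widget("weather", "Forecast for " + query.replace("weather", "").strip())
    return None


def stock_widget(query: str) -> Optional[Widget]:
    if "stock" in query:
        return Widget("stock", "INTC: $xx.xx")
    return None


def llm_answer_widget(query: str) -> Optional[Widget]:
    # Only trigger for open-ended, question-like queries that the classic
    # widgets cannot handle; the content would come from an LLM call.
    if query.endswith("?") or query.startswith(("who", "what", "why", "how")):
        return Widget("llm_answer", "<model-generated summary with sources>")
    return None


# The LLM widget is simply another entry in the same handler list.
WIDGETS: list[Callable[[str], Optional[Widget]]] = [
    weather_widget,
    stock_widget,
    llm_answer_widget,
]


def augment_results(query: str) -> Optional[Widget]:
    """Return the first widget that claims the query, or None for plain results."""
    for handler in WIDGETS:
        widget = handler(query.lower())
        if widget:
            return widget
    return None
```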
Now it is true that LLMs are currently in their infancy and prone to making things up, but 'classic' search widgets aren't perfect either. We just closed 40 bug reports about inaccurate results in the time, unit conversion and calculator widgets, among others.
That is a lot of inaccuracies we all had to be patient with for almost two years, until we solved the problem by integrating a computational API (and interestingly, there seems to be a higher tolerance for these kinds of inaccuracies, even though users were arguably much more exposed to them).
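For context, the shape of that fix is roughly the following: instead of maintaining our own parsing for every calculation and unit conversion, the widget hands the query to an external computational engine and falls back to plain results when it can't answer. The endpoint and parameters below are placeholders, not the actual integration:

```python
# Illustrative only: routing calculator / unit-conversion queries to an
# external computational API rather than a hand-rolled parser.
# The URL and parameter names are placeholders, not the real integration.
from typing import Optional

import requests

COMPUTE_API = "https://api.example-compute.com/v1/query"  # placeholder URL


def computational_widget(query: str, api_key: str) -> Optional[str]:
    """Ask the external engine; show no widget at all on any failure."""
    try:
        resp = requests.get(
            COMPUTE_API,
            params={"input": query, "appid": api_key},
            timeout=2,  # a widget must never hold up the results page
        )
        resp.raise_for_status()
        return resp.json().get("result")  # e.g. "300 km = 186.41 miles"
    except requests.RequestException:
        return None  # fall back to plain search results
```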
So it is a process of development, trial and error in any case, and as long as we are aligned on the goal of the search engine (faster access to the information the user is looking for) I think it is something worth pursuing. We all know LLMs will get better, and it is reasonable to expect that at some point the error rate will become negligibly low; it seems prudent to have Kagi ready for that moment.
Otherwise, if the answer is not an LLM widget, I do not know what a better widget for displaying the answer to the 'what's the connection between st mungo and shepherd's crook' query would look like. Because search queries like this are not going away; if anything there will be more of them in the future, and we need an answer for them if we want to stay a relevant option on the market.