I was wondering whether we would be getting an upgrade for Mistral AI in the Assistant to Large 2, and/or the addition of a Llama model (perhaps one trained for Kagi as a component of fast queries) at some point in the future. Mistral seems to have made good progress with Large 2 over Large across various benchmarks, and Llama has been shown to be a very capable and adaptable model.
Specifically: an introduction of Large 2 into the Assistant, possibly replacing or complementing Large (depending on cost etc.), and the addition of Llama 3.1 (405B and 70B offered separately, perhaps, due to the difference in speed and cost).
If Kagi would consider using Llama models to train a more capable fast-query model, it would be great to see them integrated there as well.
Both families seem to be very capable LLMs and would have use cases for different users. Kagi has done very well with offering a variety of LLMs, and it would be nice to see an expansion to cover (nearly) all of the presently leading models.