I feel that people are completely unrealistic about the costs of providing an API to "access their credits".
In a previous post Vlad stated that search "Costs us about 1.25 cents per search". If that per-search cost lines up with the search count shown in the billing portal, then I was burning over $30 in search costs a month on a $10 subscription. And that didn't include any LLM costs.
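To make that concrete, here's the back-of-envelope math, assuming Vlad's 1.25 cent figure is accurate (the monthly spend is just my own usage):

```python
# Back-of-envelope check: how many searches does $30/month imply
# at the stated 1.25 cents per search?
COST_PER_SEARCH = 0.0125   # USD, figure stated by Vlad
MONTHLY_SEARCH_SPEND = 30.00  # USD, my usage per the billing portal

searches_per_month = MONTHLY_SEARCH_SPEND / COST_PER_SEARCH
print(f"{searches_per_month:.0f} searches/month")  # 2400 searches/month
```

That's roughly 2,400 searches a month, which is not an extreme power-user number, and the search cost alone already triples the $10 subscription price.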
I bumped my subscription up to $25 only to find I had burned $1 in the first half of the first day, because !code now maps to Claude Sonnet 4 on the $25 plan instead of Kimi K2 as it does on the $10 plan.
This seems like a huge mistake. The Ultimate plan costs are simply unsustainable.
The costs of Anthropic models like Claude Sonnet 4 are absurd. OpenRouter shows Sonnet 4 costing $3-$6/M input tokens and $15-$22.50/M output. Meanwhile Kimi K2 0905 on OpenRouter ranges around $0.50-$0.75/M input and $2-$2.50/M output. No wonder Ultimate users are net negative for Kagi the minute they touch the assistant.
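A rough per-exchange comparison, using the low end of the OpenRouter price ranges above (the token counts here are hypothetical; a search-grounded assistant exchange can easily push thousands of input tokens from retrieved results):

```python
# USD per million tokens: (input, output), low end of the OpenRouter ranges above
PRICES = {
    "Claude Sonnet 4": (3.00, 15.00),
    "Kimi K2 0905": (0.50, 2.00),
}

def exchange_cost(model, input_tokens, output_tokens):
    """Cost in USD of one assistant exchange at the given token counts."""
    inp, out = PRICES[model]
    return (input_tokens * inp + output_tokens * out) / 1_000_000

# Hypothetical exchange: 5,000 input tokens (prompt + search results),
# 1,000 output tokens.
for model in PRICES:
    print(f"{model}: ${exchange_cost(model, 5_000, 1_000):.4f}")
```

Even at the cheapest listed prices, Sonnet 4 comes out around $0.03 per exchange versus roughly $0.0045 for Kimi K2, a difference of more than 6x before any of the pricier tiers kick in.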
Also keep in mind that Anthropic and OpenAI are charging these steep API prices, and selling subscription plans that cost hundreds of USD a month yet still get rate limited, and these two companies are still burning cash and wildly unprofitable.
Look up Edward Zitron's article "Why Everybody is Losing Money On AI". These prices are subsidized and will get higher over time as the providers run out of other people's money to burn. And while the cost of some unit of inference on a GPU might be going down, two trends will make it more expensive: new models generate enormous numbers of reasoning tokens that you pay for, and the cost of electricity in the U.S. is rising and will continue to rise. Kagi is a reseller of LLM APIs here. The API providers will have to raise prices on Kagi dramatically to reach profitability themselves.
I don't see any future where an unlimited AI assistant works for Kagi unless it focuses on self-hosting smaller open-weight models to keep costs as low and controlled as possible. API access will have to be pay per call.
I use Kagi Assistant because the search integration is amazing, and I prefer LLMs that answer my questions from search results, with citations for everything, rather than from whatever "world knowledge" is baked into the model.
I don't see how this functionality can be offered long term on a measly $25 plan.
I'd love to see a pay-as-you-go search API where Kagi makes a small profit on each call. And I fully expect any future subscription plan with the Assistant UI to offer only X base credits, with anything beyond that paid for separately.