Provide an API for the Assistant, similar to other Kagi APIs already available, like FastGPT. An Assistant API would allow users to connect the various apps and tools that require an LLM.
All of these tools are compatible with the OpenAI API, so an OpenAI-compatible API would be the most useful. Having multiple LLMs available complicates things a bit, though, since the user has to select an LLM somehow. Maybe the chosen LLM could be specified in the API endpoint URI?
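To show what I mean (just a rough sketch — the base URL, token, and model name below are made up for illustration, nothing like this exists yet), a tool built on the official `openai` Python package could simply be pointed at Kagi, with the chosen LLM passed via the standard `model` field:

```python
from openai import OpenAI

# Hypothetical Kagi Assistant endpoint and model name, for illustration only.
client = OpenAI(
    base_url="https://kagi.com/api/v1/assistant",  # assumed endpoint
    api_key="YOUR_KAGI_API_TOKEN",                 # assumed Kagi API token
)

# Standard OpenAI-style chat completion request; the model field selects the LLM.
response = client.chat.completions.create(
    model="claude-3-5-sonnet",
    messages=[{"role": "user", "content": "Summarize the attached paper."}],
)
print(response.choices[0].message.content)
```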
Alternatively, just providing a pass-through to OpenAI's API would be great. You could forward the request directly to the OpenAI API and tell users to consult the OpenAI API docs. In theory this would require no maintenance of the Assistant API even if the OpenAI API is updated.
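Roughly what I have in mind with "pass-through" (again only a sketch with assumed paths and a server-side key lookup, not a claim about how Kagi would actually build it): the proxy just forwards the request body and swaps the user's Kagi token for an OpenAI key.

```python
import httpx
from fastapi import FastAPI, Request
from fastapi.responses import Response

app = FastAPI()
OPENAI_BASE = "https://api.openai.com"

@app.post("/v1/{path:path}")
async def passthrough(path: str, request: Request):
    # Forward the body unchanged; substitute an OpenAI key looked up from the
    # user's Kagi account. (Streaming responses omitted for brevity.)
    body = await request.body()
    async with httpx.AsyncClient() as client:
        upstream = await client.post(
            f"{OPENAI_BASE}/v1/{path}",
            content=body,
            headers={
                "Authorization": "Bearer <OPENAI_KEY>",
                "Content-Type": request.headers.get("content-type", "application/json"),
            },
            timeout=60.0,
        )
    return Response(
        content=upstream.content,
        status_code=upstream.status_code,
        media_type=upstream.headers.get("content-type"),
    )
```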
Existing users of the web interface wouldn't be affected.
Some example tools that require an LLM API: paper-qa (https://github.com/Future-House/paper-qa), gpt-pilot (https://github.com/Pythagora-io/gpt-pilot), and various custom chat UIs. There's obviously huge interest in this area right now, so new tools are popping up daily.
All the other big LLM providers offer an API. Without an Assistant API, I'd have to buy a subscription to another LLM provider on top of Assistant, and I don't want to pay for both.
This is really a feature request targeted at all the developers out there who play around with this stuff, but I think your subscriber base includes a lot of those people 🙂