
When using the AI assistant, if the thread gets too long, the AI's response eventually loads indefinitely and a new session is required.

This has happened at some point in every long thread I've used.

The goal, of course, would be that AI responses continue to work regardless of thread length.

  • Vlad replied to this.

    Is anyone else experiencing this?

    Forgive some cynicism, but could this be by design to discourage heavy AI usage and cut costs? Not an accusation; it just seems like an odd bug, one I've been experiencing since day 1 of using it.

      Personwithanaccount LLMs will get slower as the conversation gets longer; that's nothing that can be changed. But indefinite loading is something I haven't experienced.

        Personwithanaccount It's very likely that you've reached the token limit of GPT-4, as it only has a context size of 8,192 tokens. You can check how many tokens you've entered here: https://platform.openai.com/tokenizer

        I don't have any insights into how Kagi manages the context size; maybe this is a bug where it just gets stuck because it has reached the maximum context size.

        But I'd think they would've thought about that and would throw an error instead 🤷‍♂️
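A rough way to sanity-check whether a long thread is approaching the 8,192-token window (the OpenAI tokenizer page above gives exact counts; the ~4 characters per token rule of thumb used here is only an approximation for English text, and the function names are just for illustration):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters/token heuristic.
    Use a real tokenizer (e.g. the OpenAI tokenizer page) for exact counts."""
    return max(1, len(text) // 4)

def fits_in_context(messages: list[str], limit: int = 8192) -> bool:
    """Check whether the whole conversation plausibly fits in the model's
    context window (limit defaults to GPT-4's 8,192-token window)."""
    return sum(estimate_tokens(m) for m in messages) <= limit

# A short thread fits comfortably; a very long one does not.
print(fits_in_context(["hello world"] * 10))    # True
print(fits_in_context(["hello world"] * 5000))  # False
```

If the assistant silently drops requests once this limit is exceeded instead of surfacing an error, that would match the "infinite loading" symptom described above.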

          Personwithanaccount

          Forgive some cynicism, but could this be by design to discourage heavy AI usage to cut costs?

          We do not do things like this. Bugs are another matter.
