Would probably make more sense to use a token limit instead of a character limit.
At least for the OpenAI models this should be fairly easy using something like tiktoken.
Maybe also show the number of tokens used even when the prompt is under the limit.
It would give users a clearer understanding of how many tokens they are using when entering a prompt.