Kagibeh Well, think again. With the majority of humans doing all they can to make the lives of other human beings as miserable as possible, I think everyone should have the right to put an end to their misery as they see fit.
According to the WHO (https://www.who.int/news-room/fact-sheets/detail/suicide):
While the link between suicide and mental disorders (in particular, depression and alcohol use disorders) and a previous suicide attempt is well established in high-income countries, many suicides happen impulsively in moments of crisis with a breakdown in the ability to deal with life stresses, such as financial problems, relationship break-up or chronic pain and illness.
Yes, in some cases (e.g., chronic or terminal illness), the person will have thoroughly considered all the options and, with a clear conscience, concluded that suicide is the best option. But for many (someone already mentioned this above, but I think it needs to be said again), the person who committed suicide was not in a clear state of mind in the moments leading up to their death.
Kagibeh As you see, none of what you are suggesting is as universal as you're framing it - it is simply your US-based opinion. Of course I'd like to live in a better world, and fewer suicides might be the fallout of that, but starting out by claiming that killing yourself is wrong is getting things backwards once again.
Firstly, I'm not American. Secondly, I don't think the idea that we should help those who are suicidal find ways to alleviate their suffering that don't result in them taking their own lives is a solely US-based one.
Kagibeh They wouldn't? So someone who searches for french fries, processed meats or soft drinks is not looking up how to get a heart attack? This is odd, because they are certainly receiving the perfect instructions.
People looking up "cheeseburger" very likely aren't thinking to themselves, "Ah yes, I want to get a heart attack". People looking up "how to kill self" very possibly are thinking of ending their lives right then and there.
Kagibeh If you're looking for pharmaceutical investments, your search results may not teach you "how to get AIDS", but they might as well be a blueprint for how to let others die from AIDS, as providing help to everyone who needs it happens to be completely detrimental to stock market value. Of course, you could also search for "mahogany floors" if you'd like to find out "how to further destroy the rain forests and ruin planet Earth". Should I continue? Because I could do this all day.
Since you seem to be a betting person, I would take a wager that bad consumer decisions are by far the biggest killer on this planet. How about it? Why not ask Kagi to post a warning that "Shopping kills" on every shopping-related query? Kagi isn't going to employ that either (and rightly so), but I'd certainly applaud you for consistency.
You're comparing apples to oranges here. Things like capitalist exploitation and environmental degradation are problems whose causes and solutions go far beyond what search engines can influence. Suicide, on the other hand, is something that search engines like Kagi can have an immediate impact on preventing. Also, there are already search engines (like Ecosia) that cater to people who care deeply about those issues. Which brings me to my last point...
Kagibeh Without consistency, the line between the unacceptable (french jokes) and the acceptable (institutionalized exploitation, destruction and mass murder) will always be a deeply ideological one, and once again I am glad that Kagi isn't going anywhere near it.
geeknik I think every search result should have its own tailored "nanny state" disclaimer. The AI infrastructure is already in place, so it should be easy. I mean, we can't have the populace thinking for themselves, can we?
In post #95, I mentioned this potential solution (that I believe was brought up before in this thread as well):
Allow user to determine which widgets they wish to see with toggle + self- or publicly-defined widget lists.
This would let users decide for themselves what type of behaviour they wish Kagi to have for search results, according to whatever lines of acceptability they draw in their own minds.
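To make that concrete, here's a minimal sketch of what per-user widget preferences could look like. To be clear, this is purely my own illustration: every name in it (WidgetPreferences, shouldShowWidget, the widget IDs) is hypothetical and has nothing to do with Kagi's actual settings schema or codebase.

```typescript
// Hypothetical sketch only -- none of these identifiers come from Kagi.

// Each informational widget the engine can show above search results.
type WidgetId = "suicide_prevention" | "calculator" | "weather" | "wikipedia";

interface WidgetPreferences {
  // Master toggle: show any widgets at all?
  widgetsEnabled: boolean;
  // Per-widget overrides set by the user themselves.
  overrides: Partial<Record<WidgetId, boolean>>;
  // Optional publicly shared list the user subscribes to,
  // e.g. a community-maintained "minimal widgets" profile.
  subscribedList?: string;
}

// Example: a user who keeps the safety widget but hides the rest.
const myPrefs: WidgetPreferences = {
  widgetsEnabled: true,
  overrides: {
    suicide_prevention: true,
    calculator: false,
    weather: false,
  },
  subscribedList: "community/minimal-widgets",
};

// Deciding whether to render a given widget.
function shouldShowWidget(prefs: WidgetPreferences, id: WidgetId): boolean {
  if (!prefs.widgetsEnabled) return false;
  // The user's explicit override wins; otherwise fall back to showing it.
  return prefs.overrides[id] ?? true;
}
```

The point of the explicit override map plus an optional shared list is that each person gets to draw the line wherever they want, without Kagi having to impose one line globally.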
Oh btw, Kagi already has filters on by default. So Kagi already draws a line between what is acceptable and what is not ("sensitive material from general search results", which at the very least includes NSFW content and potential malware, as confirmed in this thread). The other solution I mentioned was:
Add content encouraging suicide to the NSFW/"Safe Search" filter (if it isn't in there already).
I am all for freedom of choice, and I believe either of these potential solutions strikes a balance between those wanting safeguards and those wanting more personal autonomy over their search experience. I encourage both of you to read that post.