Vlad
From the OP:
...the developer of uBlacklist has little motivation to add support for other search engines.
sle4zy84nq OK, I understand the problem is that uBlacklist does not support Kagi. No search engine natively supports uBlacklist lists, and the ask is for us to do so.
We cannot expand our backend right now to support something like a 100,000-entry blacklist. The cost of processing it for every query would be non-trivial, and there would be a latency penalty as well.
I understand the uBlacklist extension is open-source - would you be open to submitting a patch to support Kagi's front-end?
Vlad I don't want the backend to handle the blacklist. As a client-side function, it would be enough to read the uBlacklist filter, check it for updates, apply it, and manage it.
Moreover, if this feature were implemented, couldn't the contents of the filters be used as machine-learning data on garbage domains, which would strengthen Kagi?
Users' evaluations could strengthen the sales pitch of not showing garbage in search results.
Feel free to use the whole thing. -> https://github.com/Chamiu/Search-Block-Parasiticide
With the current implementation, it is troublesome that search results already known to be garbage must be re-evaluated one by one.
There are already a lot of curated spam-site lists on the internet. As a starting point, users could add these lists to their blocklist.
I think it would be much better to be able to subscribe to such lists directly.
I've noticed that after adding 800 sites, the loading speed of https://kagi.com/settings?p=user_ranked drops significantly.
With an extension such as uBlacklist, users can easily subscribe to curated site lists and block those sites in Google search results. It is an advantage of Kagi to have this feature enhanced and built into the search engine itself. But the feature can be enhanced even further.
Thus, another piece of advice: add a notice on the search results page showing how many sites have been blocked, and let the user click a button to see what was blocked. This not only gives users insight that blocking is really working and improving their experience, but also lets them temporarily reconsider and check the blocked sites (because blocked sites can sometimes provide decent results imo, such as pinterest or geeksforgeeks). Btw, uBlacklist can do that.
It seems that a significant number of ".it" domains consist entirely of spam, most likely because they are offered for free. Such spam domains have appeared in Kagi search results at least once.
User level: Since Kagi cannot tell if such domains are spam (at the moment), one way to prevent them from appearing in users' search results is to let users block all domains with the ".it" suffix. This should be a built-in feature, since the landscape of the internet will change and the domains favored by spammers will shift over time. A user adding that suffix to their blocklist clearly finds that blocking some false positives is worth the reduction of spam in their results.
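Checking a suffix block like this is cheap, so it could plausibly live client-side. A minimal sketch of the test involved (the `is_blocked` helper and the suffix set are hypothetical, not Kagi's actual API):

```python
from urllib.parse import urlparse

def is_blocked(url: str, blocked_suffixes: set[str]) -> bool:
    """Return True if the URL's hostname ends with a blocked TLD suffix."""
    host = urlparse(url).hostname or ""
    return any(host.endswith(suffix) for suffix in blocked_suffixes)

blocked = {".it"}
print(is_blocked("https://spam-example.it/page", blocked))  # True
print(is_blocked("https://example.com/italy", blocked))     # False
```

The suffix check never touches the path, so a ".it" appearing in the URL path would not trigger a false block.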
Service level: Just like uBlock Origin has certain "default" blocklists that it suggests to new users, Kagi can also curate blocklists they believe will be useful to the users. I'm uncertain as to the specifics of how this should be implemented, since that depends on Kagi and the community.
mesaoptimizer We should probably just support the uBlacklist format
https://github.com/iorate/ublacklist#description
such as
https://raw.githubusercontent.com/arosh/ublacklist-stackoverflow-translation/master/uBlacklist.txt
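For context, a uBlacklist subscription file (as I understand the format) is plain text with one rule per line — WebExtension-style match patterns, plus regex rules written between slashes:

```
*://*.example.com/*
/^https?:\/\/spam-[0-9]+\.example\.net\//
```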
Supporting the uBlacklist format would be fantastic! Adding the ability to export in the same format would be nice too; I maintain a GitHub repo of my uBlacklist rules.
In addition to the .it domain spam, I've recently started seeing .pl spam following the same pattern of randomized domain names.
Steps to reproduce:
Try various queries? I don't have one on hand that's causing this result at the moment...
Even when coming from the US and searching in English, you may see spammy results from various .pl domains, including:
Expected behavior:
Results shouldn't include spammy .pl domains.
Debug info:
Firefox 103.0.1 on macOS 12.5. Kagi server US-EAST.
I'm experiencing the same issue. There's a planned feature to add uBlacklist blocklist support, which includes blocking entire TLDs.
I ran into the same .pl random domain spam earlier today. Being able to block TLDs would be fantastic.
Am I right that as a workaround, until TLD blocking is implemented, it is possible to create a Lens that excludes certain TLDs from search results?
Currently Kagi can't compete with extensions like uBlacklist when it comes to domain blocking. Handling large numbers of domains is very difficult due to the lack of support for bulk editing and removal. The ability to import public rulesets, similar to Subscriptions in uBlacklist, would make sharing and editing domain blacklists/whitelists much easier.
For full compatibility with existing blacklists, Kagi would need support for Match Patterns and Regular Expressions. These two features by themselves, even without support for importing lists, would already make Personalized Results in Kagi a lot more powerful.
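To illustrate what match-pattern support involves, here is a simplified sketch (my own, not uBlacklist's actual code) that compiles a WebExtension-style match pattern into a regex — `*` as the scheme means http or https, a leading `*.` on the host matches any subdomain or none, and `*` in the path matches anything:

```python
import re

def match_pattern_to_regex(pattern: str) -> re.Pattern:
    """Compile a simplified WebExtension match pattern into a regex."""
    scheme, rest = pattern.split("://", 1)
    host, _, path = rest.partition("/")
    scheme_re = "https?" if scheme == "*" else re.escape(scheme)
    if host.startswith("*."):
        # '*.' matches the bare domain or any subdomain of it
        host_re = r"([^/]+\.)?" + re.escape(host[2:])
    else:
        host_re = re.escape(host)
    # escape the path, then turn escaped '*' back into '.*' wildcards
    path_re = re.escape("/" + path).replace(r"\*", ".*")
    return re.compile(f"^{scheme_re}://{host_re}{path_re}$")

rule = match_pattern_to_regex("*://*.pinterest.com/*")
print(bool(rule.match("https://www.pinterest.com/ideas/")))  # True
print(bool(rule.match("https://pinterest.com/")))            # True
print(bool(rule.match("https://example.com/pinterest")))     # False
```

A real implementation would also need to handle edge cases like ports and the `<all_urls>` pattern, but the translation itself is mechanical.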
Currently it is not uncommon to have a large number of repeated domains in Personalized Results, such as:
pinterest.com
pinterest.at
pinterest.be
pinterest.ca
pinterest.co
pinterest.ch
pinterest.cl
pinterest.de
...
Instead of manually adding these domains, users could import public lists like this. Lists like this could also be simplified to a single entry using RegEx:
/^https?:\/\/([^\/]*\.)?pinterest\./
These options would make it much easier to handle large amounts of URLs.