
I can't use a search engine without uBlacklist anymore. I've been spoiled. There are just way too many garbage sites I don't want to see. I'd like to use Kagi, but I can't until a uBlacklist-like feature exists.

If you decide to support a uBlacklist-like feature, please consider this scenario:
I block a lot of sites; over 100,000. Some of my searches yield zero results on the first page. Without a userscript, I have to browse to the second or even third page to see any results, because the first few pages are filled with garbage I've filtered out. Loading more results automatically with a userscript works, but then I'm consistently flagged as a bot and have to complete captcha after captcha.

If I could import my uBlacklist lists and non-filtered results were nicely compiled on the first page, I'd start using Kagi.

    sle4zy84nq Can you give a few examples of how Kagi results could be better if you had the block list available? In general we are told that the results are pretty good out of the box.

      It's not about being "pretty good", and that's subjective anyway. It's about filtering out what you don't want to see. What if I don't want to see results from twitter.com? Some might consider this a good result, others never want to see it.

      I search all day while troubleshooting. There are countless AI-generated trash sites. Before uBlacklist, I had to scroll through all these trash sites. Now, if I think a website is garbage, it goes on the blacklist; never to be seen again 🙂

It's so nice to click one button and never see that domain in search results again. I update a master list with block rules, and all my PCs and even my phone update from this list.

        sle4zy84nq I guess I did not understand this:

        I can't use a search engine without uBlacklist anymore

        To my knowledge no search engine supports uBlacklist directly. The closest that exists is Kagi's blocked domains.

        What prevents you from using uBlacklist on Kagi, same as on other search engines?

          Vlad
          From the OP:

...the developer of uBlacklist has little motivation to add support for other search engines.


sle4zy84nq OK, I understand: the problem is uBlacklist not supporting Kagi. No search engine supports uBlacklist, and the ask is for us to do so.

We cannot expand our backend right now to support something like a 100,000-entry blacklist. The cost of processing it for every query would be non-trivial, and there would also be a latency penalty.

I understand the uBlacklist extension is open source - would you be open to submitting a patch to support Kagi's front-end?

              3 months later

Vlad I don't want the backend to handle the blacklist. As a client-side function, it would be enough to be able to read a uBlacklist filter, check it for updates, apply it to results, and manage it.
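To illustrate the client-side idea, here is a minimal, hypothetical sketch (in Python rather than the extension's JavaScript, and simplified: real uBlacklist also supports `/regex/` rules and title matching, and its `*.example.com` patterns also match the bare domain, which this sketch does not):

```python
import re

def pattern_to_regex(pattern: str) -> re.Pattern:
    """Convert a uBlacklist-style match pattern such as
    '*://*.example.com/*' into a compiled regex.
    '*' is treated as the only wildcard, matching any run of characters."""
    escaped = re.escape(pattern)
    regex = escaped.replace(r"\*", ".*")
    return re.compile(f"^{regex}$")

def filter_results(urls, rules):
    """Return only the URLs not matched by any blocklist rule."""
    compiled = [pattern_to_regex(rule) for rule in rules]
    return [u for u in urls if not any(p.match(u) for p in compiled)]

blocked = ["*://*.spam-site.example/*"]  # made-up rule for illustration
results = [
    "https://docs.python.org/3/",
    "https://www.spam-site.example/article",
]
print(filter_results(results, blocked))
# -> ['https://docs.python.org/3/']
```

With the rules compiled once up front, filtering is a cheap per-result check, which is why doing it in the browser rather than on the backend avoids the latency concern mentioned above.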

Moreover, as a condition of implementing this feature, couldn't the contents of users' filters be used as machine-learning signals about garbage domains, which would improve Kagi itself?

User-supplied evaluations could strengthen the selling point of not displaying garbage in search results.
I don't mind if you use my entire list. -> https://github.com/Chamiu/Search-Block-Parasiticide

The current implementation, in which search results already known to be garbage must be re-evaluated one by one, is troublesome.

There are already a lot of curated spam-site lists on the internet. To get started, users could add these lists to their blocklist.

                I think it would be much better to be able to:

                • batch edit blocked/lowered/raised/pinned sites
                • search among these customized rules
                • subscribe to external lists

I notice that after I added 800 sites, the loading speed of https://kagi.com/settings?p=user_ranked dropped significantly.


With an extension such as uBlacklist, users can easily subscribe to curated site lists and block sites from Google search results. It is an advantage of Kagi to have this feature built into the search engine itself. But the feature can be enhanced even further.

Thus, another suggestion: add a notice on the search results page showing how many sites have been blocked, and let the user click a button to reveal what was blocked. This would not only give users insight that blocking is really working, but also let them temporarily reconsider and check the blocked sites (because blocked sites can sometimes provide decent results imo, such as pinterest or geeksforgeeks). Btw, uBlacklist can do that.

                  Merged 1 post from Bulk edit for blocked/lowered/raised/pinned sites, and the search feature for these sites.

                    Fernandez Please keep it to one suggestion per post. Can you create a new one for the other one?

It seems that a significant number of ".it" domains consist entirely of spam, most likely because they are offered for free. Such spam domains have appeared at least once in Kagi search results.

User level: Since Kagi cannot (at the moment) tell whether such domains are spam, one way to prevent them from appearing in users' search results is to allow users to block all domains whose suffix is ".it". This should be a feature, since the landscape of the internet will change and the domains favored by spammers will shift over time. A user adding that domain suffix to their blocklist clearly finds that blocking certain false positives is worth the reduction of spam in their search results.
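For what it's worth, a suffix check like this is cheap to do. A small hypothetical sketch in Python (the suffix set and function name are made up for illustration, not anything Kagi actually implements):

```python
from urllib.parse import urlparse

# Hypothetical user-configured set of blocked TLD suffixes.
BLOCKED_SUFFIXES = {".it", ".pl"}

def is_blocked(url: str) -> bool:
    """True if the URL's hostname ends with a blocked suffix.

    The leading dot keeps e.g. ".it" from matching a hostname that
    merely ends in the letters "it" (like "example.bit").
    """
    host = urlparse(url).hostname or ""
    return any(host.endswith(suffix) for suffix in BLOCKED_SUFFIXES)

print(is_blocked("https://ngmd.roofmasters.pl/page"))  # True
print(is_blocked("https://example.com/"))              # False
```

A per-result string comparison like this would be negligible compared to the cost of the search itself, which is why TLD blocking seems feasible even for large blocklists.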

                      Service level: Just like uBlock Origin has certain "default" blocklists that it suggests to new users, Kagi can also curate blocklists they believe will be useful to the users. I'm uncertain as to the specifics of how this should be implemented, since that depends on Kagi and the community.


Supporting the uBlacklist format would be fantastic! Adding the ability to export in the same format would be nice too; I maintain a GitHub repo of my uBlacklist rules.

                        In addition to the .it domain spam, I've recently started seeing .pl spam following the same pattern of randomized domain names.

                          Steps to reproduce:
                          Try various queries? I don't have one on hand that's causing this result at the moment...

                          Even when coming from the US and searching in English, you may see spammy results from various .pl domains, including:

                          • ngmd.roofmasters.pl
                          • ejslxb.poradnik-kuchenny.pl
                          • ubl.stacjakomputerowa.pl
                          • mithc.lkstworkow.pl
                          • lwz.tomexplast.pl
                          • rvlir.moto-arena.pl
                          • rdgep.karczma-raznawozie.pl
                          • wgvwys.warsztat-kulinarny.pl

                          Expected behavior:
                          Results shouldn't include spammy .pl domains.

                          Debug info:
                          Firefox 103.0.1 on macOS 12.5. Kagi server US-EAST.

                            Merged 2 posts from Spam from .pl domains in results.
                              a year later

I ran into the same .pl random-domain spam earlier today. Being able to block TLDs would be fantastic.

                                3 months later

                                Am I right that as a workaround, until TLD blocking is implemented, it is possible to create a Lens that excludes certain TLDs from search results?

                                • Vlad replied to this.

                                  9rqnq That could work, and is worth a try.