https://themarkup.org/privacy/2025/08/12/we-caught-companies-making-it-harder-to-delete-your-data
Something Kagi could do to counter this malicious behavior is make a situational exception (probably a manual override) so the crawler indexes every page on the site, including the opt-out page(s) they're trying to hide. TurboTax did this with its Free File page, and data brokers are doing it to make the "delete my data / opt out of collection or retention" page much harder to find, which I consider malicious behavior. I am of the mindset that if a company is misusing standard tools like robots.txt, it doesn't get to benefit from them at all. Kagi could also highlight these pages instead of just adding them to search results normally, to ensure maximum discomfort for the unethical company.
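To make the idea concrete, here is a minimal sketch of the crawler-side check this override would need: given a site's robots.txt, flag which of a set of likely opt-out paths the site is disallowing. This uses Python's stdlib `urllib.robotparser`; the domain, the candidate paths, and the `blocked_optout_paths` helper are all hypothetical, not anything Kagi actually runs.

```python
from urllib.robotparser import RobotFileParser

def blocked_optout_paths(robots_txt: str, base: str, candidates: list[str]) -> list[str]:
    """Return the candidate paths that robots.txt disallows for a generic crawler.

    A search engine could then deliberately override robots.txt for just these
    flagged paths and index them anyway.
    """
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    # can_fetch("*", url) asks whether a crawler with no special user-agent
    # rules is allowed to fetch the URL under this robots.txt
    return [p for p in candidates if not parser.can_fetch("*", base + p)]

# Hypothetical robots.txt that hides the opt-out page while allowing everything else
robots = """User-agent: *
Disallow: /opt-out
Allow: /
"""

print(blocked_optout_paths(robots, "https://example-broker.com", ["/opt-out", "/about"]))
# → ['/opt-out']
```

In practice the candidate list would come from crawling the site's footer links and sitemap rather than a hardcoded list, but the flagging logic would look much like this.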
"While those companies might be fulfilling the letter of the law by providing a page consumers can use to delete their data, it means little if those consumers can’t find the page, according to Matthew Schwartz, a policy analyst at Consumer Reports who studies the California law governing data brokers and other privacy issues. ‘This sounds to me like a clever work around to make it as hard as possible for consumers to find it,’ Schwartz said."
"After reviewing the websites of all 499 data brokers registered with the state, we found 35 had code to stop certain pages from showing up in searches."
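The "code" the article describes is presumably something like a robots `noindex` meta tag, which is the other common way (besides robots.txt) to keep a page out of search results. A minimal sketch of detecting it with Python's stdlib `html.parser` — the `NoindexDetector` class and the sample page are my own illustration, not the article's methodology:

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Sets self.noindex if the page carries <meta name="robots" content="...noindex...">."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            # match e.g. <meta name="robots" content="noindex, nofollow">
            if a.get("name", "").lower() == "robots" and "noindex" in (a.get("content") or "").lower():
                self.noindex = True

# Hypothetical opt-out page that asks search engines not to index it
page = '<html><head><meta name="robots" content="noindex, nofollow"></head>' \
       '<body>Delete my data</body></html>'

detector = NoindexDetector()
detector.feed(page)
print(detector.noindex)  # → True
```

A crawler that deliberately ignores this directive on identified opt-out pages (and, as suggested above, highlights them) would neutralize exactly the trick the article found on 35 broker sites.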
I can see a Kagi user taking advantage of this by searching for an opt-out page (or an unsubscribe option, for services that like to hide those as well); they would be the prime audience for this feature. The page could show up as a normal result, as a highlighted result, or behind a search/account toggle that highlights such pages. The same treatment could also apply to privacy policy pages that companies try to make as hidden as possible.