MrAlexey The internet has become a bloated mess. Massive JavaScript libraries, countless client-side queries, and overly complex frontend frameworks are par for the course these days. When online newspapers like The Guardian weigh in at over 4MB, you know there's a problem. Why does an online newspaper need to be over 4MB? It's crazy.

More: https://github.com/kevquirk/512kb.club

Hi! It would be super helpful if we could search for fast and optimized web sites. The most naive approach: in search results, add a badge showing the page's size in bytes, and allow filtering results based on that size.
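A minimal sketch of that naive approach, assuming each search result carries a reported byte size (the field and function names here are hypothetical, not anything Kagi actually exposes):

```python
def format_badge(size_bytes: int) -> str:
    # Human-readable size badge for a search result, e.g. "412 KB".
    if size_bytes < 1024:
        return f"{size_bytes} B"
    if size_bytes < 1024 * 1024:
        return f"{size_bytes // 1024} KB"
    return f"{size_bytes / (1024 * 1024):.1f} MB"

def filter_by_size(results: list[dict], max_bytes: int) -> list[dict]:
    # Keep only results whose reported page weight is known and under the cap;
    # results with no size data are dropped rather than guessed at.
    return [
        r for r in results
        if r.get("size_bytes") is not None and r["size_bytes"] <= max_bytes
    ]
```

A cap of 512 * 1024 would match the 512KB Club threshold from the link above.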
Anonymous12 Neat idea! I wonder if this could be included/captured as part of Kagi Small Web. The content there looks very Small Web.
RoxyRoxyRoxy Hmm, not sure if we get the size from upstreams. Can check if this is possible somehow but kinda doubt it atm. Would likely have to be a Small Web thing.
tux0r RoxyRoxyRoxy Hmm, not sure if we get the size from upstreams. Doesn't Kagi already fetch upstreams for tracker detection? Reading the Content-Length header could be a part of that.
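A hedged sketch of what reading that header might look like with Python's stdlib (function names are illustrative, not Kagi's actual pipeline). Two caveats worth noting: servers using chunked transfer encoding often omit Content-Length entirely, and the header only covers the HTML document itself, not the scripts, styles, and images that make up most of a page's real weight.

```python
import urllib.request
from typing import Optional

def parse_content_length(headers) -> Optional[int]:
    # Content-Length may be absent (e.g. chunked transfer) or malformed,
    # so treat it as best-effort data rather than a guaranteed value.
    raw = headers.get("Content-Length")
    try:
        return int(raw) if raw is not None else None
    except (TypeError, ValueError):
        return None

def head_size(url: str, timeout: float = 5.0) -> Optional[int]:
    # HEAD fetches headers only, so this is cheap compared to a full GET.
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return parse_content_length(resp.headers)
```

If Kagi already fetches pages for tracker detection, measuring the actual downloaded body size would sidestep the missing-header problem.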