When I search for anything on Google or DuckDuckGo, more than half of the results are useless, AI-generated articles.

Those articles are generated to rank among the top results for queries, since search engines use algorithms to index websites and pages.

If we manually curated “good” websites (newspapers, forums, encyclopedias, anything that can be considered a good source) and only indexed their contents, would it be possible to create a good ol’ fashioned search engine? Does it already exist?
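For illustration, the “only index curated sites” idea boils down to putting a hand-maintained domain allowlist in front of the crawler, so nothing off-list ever enters the index. A toy sketch (the domains and page texts here are made up, and a real engine would obviously need crawling, ranking, and far more scale):

```python
# Toy curated search engine: pages are admitted only if their domain is on a
# hand-curated allowlist, then stored in a tiny in-memory inverted index.
from urllib.parse import urlparse
from collections import defaultdict

# Hypothetical curated list -- in practice this is the hard, human part.
ALLOWED_DOMAINS = {"en.wikipedia.org", "example-newspaper.com"}

class CuratedIndex:
    def __init__(self):
        self.postings = defaultdict(set)  # word -> set of URLs containing it

    def add_page(self, url, text):
        # Reject anything whose domain is not on the curated allowlist.
        if urlparse(url).netloc not in ALLOWED_DOMAINS:
            return False
        for word in text.lower().split():
            self.postings[word].add(url)
        return True

    def search(self, query):
        # Return URLs containing every query word (simple AND search).
        words = query.lower().split()
        if not words:
            return set()
        results = self.postings[words[0]].copy()
        for w in words[1:]:
            results &= self.postings[w]
        return results

idx = CuratedIndex()
idx.add_page("https://en.wikipedia.org/wiki/Search_engine",
             "search engine indexing basics")
idx.add_page("https://spam-farm.example/ai-slop",
             "search engine tricks")  # rejected: domain not allowlisted
print(idx.search("search engine"))  # only the curated page comes back
```

The filtering itself is trivial; the open problems the thread raises are exactly the non-code ones: who maintains the allowlist, and how you keep bad actors from gaming their way onto it.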

  • haverholm@kbin.earth
    9 days ago

    You could try something like the huge AI blocklist for uBlock Origin. It cleans “AI” results out of Google, DDG and Bing searches.

    I’m not sure how a curated search engine would work in practice, though it’s a nice idea. It would just be an enormous task, and vulnerable to manipulation by bad actors if you crowdsource it. I’d love to be proven wrong, though.

    As @ironhydroxide@sh.itjust.works said already, a self-hosted searxng instance would give you some individual curation capabilities, but it wouldn’t be the kind of “for the greater good” project I think you might be looking for?