When I search for anything on Google or DuckDuckGo, more than half of the results are useless AI-generated articles.
Those articles are generated to rank at the top of search results, since search engines use algorithms to index websites and pages.
If we manually curated “good” websites (newspapers, forums, encyclopedias, anything that can be considered a good source) and only indexed their contents, would it be possible to create a good ol’ fashioned search engine? Does one already exist?
Skipping a few generations there. Manual indexing stopped being feasible long before Google existed. Engines like the eponymous WebCrawler would follow the links between sites and let you search the full text of the pages they found.
Google came along later with enhancements to how pages are ranked, based on the relationships of links between pages.
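As a rough sketch of that link-based idea (not Google's actual implementation), here's a toy PageRank-style power iteration in Python. The graph and page names are made up for illustration; the point is just that pages with more incoming links from well-linked pages end up ranked higher:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                # Each page shares its rank equally among its outlinks.
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new[target] += share
            else:
                # Dangling page: spread its rank over all pages.
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank

# Hypothetical link graph: three pages link to "encyclopedia",
# so it should come out on top.
graph = {
    "blog": ["encyclopedia"],
    "forum": ["encyclopedia", "blog"],
    "newspaper": ["encyclopedia", "forum"],
    "encyclopedia": ["newspaper"],
}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # the best-linked page wins
```

The real thing runs on billions of pages with lots of extra signals and spam countermeasures, but the core insight was exactly this: treat a link as a vote, and weight votes by the rank of the voter.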