I’ve started relying more on AI-powered tools like Perplexity for many of my search use cases for this very reason - pretty much all results warrant pre-filtering to be useful.
Counterpoint: we had good search results a decade ago, and Google voluntarily eroded its product quality for a pittance of extra ad revenue.
Having a decent search engine is achievable and we don’t need to shoehorn AI into fucking everything.
I don’t disagree, but for obvious reasons we can’t access the Google of a decade ago; they’ve made it unavailable.
I’m not really describing an ideal state; this is a mere matter of practicality.
Unfortunately the spam arms race has destroyed any chance of search going back to the good ole days. SEO and AI content farms mean we’ll need a whole new system to categorize webpages, as well as filter out human-sounding but low-effort spam.
Point being, it’s no longer enough to find a page that’s relevant to the topic; it has to be relevant and actually deliver information, and currently the only feasible tech that can tell those apart is LLMs.
It would be interesting tho to use an LLM to spot AI/SEO crap and add whole domains to a search blacklist. In that case we wouldn’t need AI to do the actual search, and this could easily just be a database the search engine maintains for end users (kinda like explicit content filters).
I’d call that option “Bullspam filter” and leave it on “moderate” by default.
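To make it concrete, here’s a rough sketch of what I mean. Every name in it is made up for illustration, and the LLM scoring function is just a placeholder, not any real API:

```python
# Hypothetical sketch of the "Bullspam filter" idea: an LLM scores pages
# for SEO/AI slop, and domains that repeatedly fail get their whole domain
# added to a shared blocklist that the search engine ships to end users.
# All names (score_page_with_llm, SLOP_THRESHOLD, ...) are invented here.

from collections import defaultdict

SLOP_THRESHOLD = 0.8   # score above which a page counts as slop
STRIKES_TO_BLOCK = 5   # repeat offenders get the whole domain blocked

strikes: defaultdict[str, int] = defaultdict(int)
blocklist: set[str] = set()

def score_page_with_llm(page_text: str) -> float:
    """Placeholder for an LLM call returning a 0..1 'slop' score,
    e.g. prompting a model to judge whether the page actually
    delivers information or is keyword-stuffed filler."""
    raise NotImplementedError("plug in your LLM of choice here")

def record_page(domain: str, page_text: str) -> None:
    """Run offline by a crawler: score a page and track strikes per domain."""
    if score_page_with_llm(page_text) >= SLOP_THRESHOLD:
        strikes[domain] += 1
        if strikes[domain] >= STRIKES_TO_BLOCK:
            blocklist.add(domain)  # whole domain goes on the blacklist

def filter_results(results: list[dict]) -> list[dict]:
    """Run at query time: drop results whose domain is blocklisted.
    Each result is assumed to have a 'domain' key."""
    return [r for r in results if r["domain"] not in blocklist]
```

The nice part is that the LLM only runs offline to build the blocklist; the actual search-time filtering is just a set lookup, same as an explicit content filter.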
I’d call it the Slop Bucket
This is one solution to the issue, and it seems silly you are being downvoted for it.
Google became what it became, and years of SEO-optimisation cat-and-mouse play have reached new heights. The spammers obviously target Google instead of its competitors, for now.
Would that we could have perfect search results; it would benefit Google as well.
I think it might have to do with the broad anti-AI sentiment that seems to be present here on Lemmy.