• gedaliyah@lemmy.world · 10 months ago

    Yes. If you do a search on this site for posts about Google, you’ll find multiple threads about this. Basically it seems that Google is losing the arms race against SEO, and new LLM bots are mostly responsible.

    • Rayspekt@kbin.social · 10 months ago

      Basically it seems that Google is losing the arms race against SEO[…]

      What does this mean in particular?

      • Deceptichum@kbin.social · 10 months ago

        Companies are better at getting their shitty product/spammy pages to the top of search results than Google is at finding high-quality pages to show you.

        Google has to create algorithms that judge pages based on their content and return good results, while companies only have to fine-tune their pages to match the algorithm.
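
        As a toy illustration of that asymmetry (the scoring below is made up and nothing like Google’s real ranking): a naive keyword-density score is trivial for a page author to inflate.

        ```python
        # Toy sketch: a naive keyword-density "relevance" score and how a page
        # author games it. Purely illustrative -- real ranking is far more complex.

        def naive_relevance(page_text: str, query: str) -> float:
            """Score a page by how dense the query terms are in its text."""
            words = page_text.lower().split()
            if not words:
                return 0.0
            query_terms = set(query.lower().split())
            hits = sum(1 for word in words if word in query_terms)
            return hits / len(words)  # term density as a crude relevance proxy

        honest_page = "How to replace the capacitor on a broken TV power board, step by step."
        stuffed_page = "best tv repair tv broken tv fix tv " * 50  # keyword stuffing

        print(naive_relevance(honest_page, "broken tv"))   # ~0.14 -- ranked lower
        print(naive_relevance(stuffed_page, "broken tv"))  # ~0.63 -- the stuffed page wins
        ```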

        • GustavoFring@lemmy.world · 10 months ago

          IIRC Google penalizes sites if they are detected to be abusing the SEO system. Not sure how effective the detection is though.

          • idunnololz@lemmy.world · 10 months ago

            It’s actually very tricky to implement, because people have used it to do “negative SEO”, which is essentially making it look like your competitors are abusing the system so that their results get lowered.
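
            A toy sketch of why that kind of detection invites abuse (the rule and threshold here are invented, not Google’s): if the penalty keys off signals an outsider can create, such as spammy inbound links, a competitor can manufacture those signals against you.

            ```python
            # Toy sketch of a link-spam penalty and the "negative SEO" attack against it.
            # The rule and the 50% threshold are made up for illustration.

            SPAMMY_DOMAINS = {"linkfarm.example", "casino-spam.example"}

            def looks_like_link_spam(inbound_links: list[str]) -> bool:
                """Hypothetical penalty rule: flag a site whose inbound links are mostly spam."""
                spam = sum(1 for domain in inbound_links if domain in SPAMMY_DOMAINS)
                return spam / max(len(inbound_links), 1) > 0.5

            # A competitor points link-farm links at the victim's site to trip the rule:
            victim_inbound = ["linkfarm.example", "casino-spam.example", "legit-blog.example"]
            print(looks_like_link_spam(victim_inbound))  # True -- the innocent site gets penalized
            ```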

      • Thorry84@feddit.nl · 10 months ago

        People with mediocre content use SEO to get themselves higher in the search results than sites with actual information on them. That way, when searching for something, you need to dig through the shit to get to the nugget of actually useful info.

        Search engines try to rank pages by how likely it is that the info the user is looking for is actually on that page. SEO makes it so that pages with a lower chance of containing the right info are ranked above pages with a higher chance. Those pages then get more hits, so marketing thinks it has done its job, but in reality it just pisses off users, who blame the dumb sites that do this and, more often, the search engine. Search engines are trying to fight this, but SEO is big business, so they are losing the battle.
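
        To make that ordering problem concrete (the numbers below are invented): the engine can only sort by whatever proxy score it can measure, so when SEO inflates the proxy without adding real information, the useful page drops below the optimized one.

        ```python
        # Toy sketch: ranking by a measurable proxy score vs. the (unknowable)
        # chance that the page actually answers the query. All numbers are invented.

        pages = [
            # (name, actual chance the answer is on the page, proxy score the engine sees)
            ("detailed-forum-thread", 0.9, 0.4),
            ("seo-optimized-listicle", 0.2, 0.8),
            ("manufacturer-support-page", 0.7, 0.5),
        ]

        # The engine can only sort by the proxy it measures:
        for name, true_chance, proxy in sorted(pages, key=lambda p: p[2], reverse=True):
            print(f"{name}: proxy={proxy}, actual chance of the answer={true_chance}")
        # The listicle ranks first despite being the least likely to contain the answer.
        ```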

        These days there are more issues, like search engines not having access to a lot of info inside so-called walled gardens, so more good info gets created in places where it can’t easily be found. Search engines have also become more and more advertisement machines rather than search engines, and with that shift in priorities the user experience deteriorates.

        But yeah SEO sucks and has always sucked.

      • gibmiser@lemmy.world · 10 months ago

        Search Engine Optimization. Not providing the best search result, but tricking the search engine into thinking you have the best result.

      • nottelling@lemmy.world · 10 months ago

        It means that if you search for anything, the first three pages of hits are the same useless websites that exist to push ads vaguely related to your search rather than real info. Trying to research a broken TV used to return things like AVForums, Reddit threads, or Samsung support sites. Now it’s “TEN BEST TVs IN 2024” pages that are nothing but sponsored content and affiliate links to TVs on Amazon.

        Google can’t figure out how to tell the difference between the former and the latter, and isn’t motivated to, because it gets paid for the ad clicks, not for the forum clicks.

      • Valmond@lemmy.mindoki.com · 10 months ago

        If Google detects that you keep searching after visiting a page, then the page you looked at probably didn’t have the right answer, right?

        SEO solution: make super long pages padded with the history of whatever you’re looking for and other mumbo-jumbo to bloat the page out, so you stay on it longer. Now Google thinks you found what you were looking for.

        And a lot of other crap ofc.
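
        A minimal sketch of that feedback loop (the dwell-time signal and the numbers below are hypothetical; Google’s real signals aren’t public): anything that keeps the user on the page longer, including useless filler, looks like success.

        ```python
        # Toy sketch of a "dwell time" signal: pages the user bounces from quickly
        # get demoted. The signal and the numbers are hypothetical.

        def dwell_time_boost(seconds_on_page: float) -> float:
            """Pretend quality signal: longer visits are read as 'the user found it'."""
            return min(seconds_on_page / 60.0, 1.0)  # capped after one minute

        concise_answer_page = 20   # the user reads the answer and leaves in 20 seconds
        padded_story_page = 180    # the user scrolls past filler and ads for 3 minutes

        print(dwell_time_boost(concise_answer_page))  # ~0.33 -- read as a miss, demoted
        print(dwell_time_boost(padded_story_page))    # 1.0  -- the bloated page is "rewarded"
        ```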

        • Rayspekt@kbin.social · 10 months ago

          Ahhh, that’s the goal behind those overly long explanations about how Jimbo Jimboson invented the spoon when I just want to look up soup recipes.

      • rtxn@lemmy.world · 10 months ago

        SEO (search engine “optimization”) is about exploiting how a search engine ranks its results. The more webpages link to a certain result (as determined by a web crawler), the higher it is displayed. That is why bloggers are often paid by bad actors to publish editorials that link to a scam, virus, or gambling website.

        Google popularized the concept in its early years, back when link counts were an organic indicator of a result’s popularity. It made them the single best choice. Then capitalism happened, and SEO became a resource to exploit.
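
        A drastically reduced sketch of that link-counting idea (illustrative only, not Google’s actual algorithm), showing why paid editorial links are worth buying:

        ```python
        # Toy sketch of link-count ranking: the more pages link to a result, the
        # higher it ranks. A drastic simplification of real link-based ranking.

        from collections import Counter

        # (source page -> target page) links a crawler might have discovered
        crawled_links = [
            ("blog-a.example", "useful-docs.example"),
            ("blog-b.example", "useful-docs.example"),
            ("paid-blog-1.example", "gambling-scam.example"),
            ("paid-blog-2.example", "gambling-scam.example"),
            ("paid-blog-3.example", "gambling-scam.example"),
        ]

        inbound = Counter(target for _source, target in crawled_links)
        for page, count in inbound.most_common():
            print(page, count)
        # Three paid editorial links are enough to outrank the genuinely useful page.
        ```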

        https://www.youtube.com/watch?v=3bARSNVobUk

      • vexikron@lemmy.zip · 10 months ago

        SEO stands for Search Engine Optimization.

        The technical details of how this is done nowadays are complicated, but SEO itself is now a pretty huge industry: website owners pay SEO companies to make their sites show up higher in search results.

        Basically, the scenario we are now in is that companies that can afford to game and manipulate how Google’s search algorithm prioritizes ‘relevance’, i.e. what you see first, have been so successful at this that it has essentially ruined the ability to find any website that can’t afford to do the same.

        Something like 99.999% of existing websites are now much harder to find without digging through pages and pages of results, whereas the tiny number of websites that can afford massive SEO show up on the first page, even for search terms they are barely related to at all.

    • Ultraviolet@lemmy.world · 10 months ago

      Google chose to ignore the SEO arms race. Winning it is trivial: if you detect anything even remotely grey-hat, blacklist the entire domain. Forever. Then SEO stops being a thing, because no one wants to risk getting anywhere near the line.
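
      As a sketch, the proposed policy is just this (illustrative only; the “negative SEO” problem mentioned upthread is exactly why such a blunt rule is risky):

      ```python
      # Sketch of the proposed zero-tolerance policy: any grey-hat signal
      # blacklists the whole domain permanently. Purely illustrative.

      blacklisted_domains: set[str] = set()

      def review(domain: str, grey_hat_detected: bool) -> None:
          """Permanently blacklist a domain on the first grey-hat detection."""
          if grey_hat_detected:
              blacklisted_domains.add(domain)  # no appeal, no expiry

      def is_served(domain: str) -> bool:
          return domain not in blacklisted_domains

      review("keyword-stuffer.example", grey_hat_detected=True)
      print(is_served("keyword-stuffer.example"))  # False -- gone from results forever
      # Caveat: combined with "negative SEO" (see upthread), a competitor could try
      # to get an innocent site's domain tripped by this rule.
      ```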

    • ThunderWhiskers@lemmy.world · 10 months ago

      What incentive does Google have to even put up a fight? Worse results = more time searching = more “traffic” = more ad revenue. It’s not like they really have to worry about search engine competitors. Please do not try to recommend DDG to me. It is just a different flavor of garbage.

      • gedaliyah@lemmy.world · 10 months ago

        Yes, there’s a great article I read a while back about how and why recipe websites became so bad and frustrating. Basically, a good website would show you the recipe; you would read it and leave. However, since you didn’t spend much time on the site, Google would rank it much lower.

        On the other hand, if you encountered a long rambling story that you had to read through, and ads you had to scroll past, before getting to the useful information, then Google would rank the site higher because you spent more time on it. That’s why there are so many memes about how bad recipe websites are.

        And of course, even before LLMs, it was trivial to implement a copy-paste bot to create a massive number of websites.