• Sergio@slrpnk.net · 3 days ago

      They’re still in the first stage of enshittification: gaining market share. In fact, this is probably all just a marketing scheme. “Hi! I’m Crazy Sam Altman and my prices are SO LOW that I’m LOSING MONEY!! Tell your friends and subscribe now!”

      • skittle07crusher@sh.itjust.works · 3 days ago (edited)

        I’m afraid it might be more like Uber, or Funko, apparently, as I just learned tonight.

        Sustained somehow for decades before finally turning any profit. Pumped full of cash like it’s foie gras by Wall Street. Inorganic as fuck, promoted like hell by Wall Street, VC, and/or private equity.

        Shoved down our throats in the end.

    • where_am_i@sh.itjust.works · 3 days ago

      Well, yes. But this is also an extremely difficult product to price. $200/month is already insane, and now you’re suggesting they should’ve gone even more aggressive. It could turn out almost nobody would use it. The optimal price here is a tricky guess.

      Although they probably should’ve sold a “limited subscription”: one that caps you at roughly the break-even number of queries per month, or maybe 2x that, but not 100x or unlimited. Otherwise exactly what happened here can happen.
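
      Back-of-the-envelope, with made-up numbers, for what such a cap could look like:

      ```python
      # Back-of-the-envelope sizing of a monthly query cap so a flat
      # subscription breaks even. All numbers are made up for illustration.

      subscription_price = 200.00   # $/month flat fee
      cost_per_query = 0.25         # assumed average $ cost to serve one query
      headroom = 2                  # allow up to 2x break-even, not 100x or unlimited

      break_even_queries = subscription_price / cost_per_query
      query_cap = int(break_even_queries * headroom)

      print(f"break-even: {break_even_queries:.0f} queries/month, cap: {query_cap}")
      # break-even: 800 queries/month, cap: 1600
      ```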

      • V0ldek@awful.systems · 3 days ago

        “Our product that costs metric kilotons of money to produce but provides little-to-no value is extremely difficult to price” oh no, damn, ye, that’s a tricky one

        • Saledovil@sh.itjust.works · 2 days ago

          What the LLMs do, at the end of the day, is statistics. If you want a more precise model, you need to make it larger. Basically, exponentially scaling marginal costs meet exponentially decaying marginal utility.
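
          A toy sketch of that trade-off, assuming a power-law loss curve in parameter count (the shape reported in scaling-law papers); the constants and the cost figure are invented:

          ```python
          # Toy illustration: cost grows linearly with parameter count while
          # the loss improvement from each 10x shrinks. The power-law shape
          # follows published scaling laws; all constants here are invented.

          a, alpha = 10.0, 0.08     # hypothetical scaling-law fit
          cost_per_param = 1e-7     # hypothetical training $ per parameter

          def loss(n_params: float) -> float:
              return a * n_params ** -alpha

          for n in [1e9, 1e10, 1e11, 1e12]:
              gained = loss(n / 10) - loss(n)   # improvement bought by the last 10x
              print(f"{n:.0e} params: ~${n * cost_per_param:,.0f} to train, "
                    f"loss {loss(n):.2f}, last 10x gained {gained:.2f}")
          ```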

            • self@awful.systems · 2 days ago

              guess again

              what the locals are probably taking issue with is:

              If you want a more precise model, you need to make it larger.

              this shit doesn’t get more precise for its advertised purpose when you scale it up. LLMs are garbage technology that plateaued a long time ago and are extremely ill-suited for anything but generating spam; any claims of increased precision (like those that openai makes every time they need more money or attention) are marketing that falls apart the moment you dig deeper — unless you’re the kind of promptfondler who needs LLMs to be good and workable just because it’s technology and because you’re all-in on the grift

              • Saledovil@sh.itjust.works · 2 days ago

                Well, then let me clear it up. The statistics become more precise. As in, for a given prefix A and token x, the difference between the model’s calculated probability of x following A, P(x|A), and the actual probability becomes smaller. Obviously, if you are dealing with a novel problem, then the LLM can’t produce a meaningful answer. And if you’re working on a halfway ambitious project, then you’re virtually guaranteed to encounter a novel problem.
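
                A toy numeric example of what “more precise” means here; the distributions are invented, and a real vocabulary has tens of thousands of tokens:

                ```python
                # Toy example of "more precise": for one prefix A, compare an assumed
                # true next-token distribution against two hypothetical models' estimates.

                def total_variation(p, q):
                    return 0.5 * sum(abs(p[t] - q[t]) for t in p)

                true_p  = {"cat": 0.50, "dog": 0.30, "rug": 0.20}   # assumed actual P(x|A)
                small_m = {"cat": 0.70, "dog": 0.20, "rug": 0.10}   # smaller model's estimate
                large_m = {"cat": 0.55, "dog": 0.28, "rug": 0.17}   # larger model's estimate

                print("small model gap:", total_variation(true_p, small_m))  # ≈ 0.20
                print("large model gap:", total_variation(true_p, large_m))  # ≈ 0.05
                ```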

                • self@awful.systems · 2 days ago

                  Obviously, if you are dealing with a novel problem, then the LLM can’t produce a meaningful answer.

                  it doesn’t produce any meaningful answers for non-novel problems either

      • confusedbytheBasics@lemm.ee · 2 days ago

        I signed up for API access. I run all my queries through that. I pay per query. I’ve spent about $8.70 since 2021. This seems like a win-win model. I save hundreds of dollars and they make money on every query I run. I’m confused why there are subscriptions at all.
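
        A rough comparison, assuming a $20/month subscription over roughly three years:

        ```python
        # Rough comparison of my pay-per-query API spend with a flat subscription.
        # The $8.70 is my actual total; $20/month and the timespan are assumptions.

        api_total = 8.70          # $ spent on API calls since 2021
        months = 3 * 12           # roughly three years (assumed)
        plus_per_month = 20.00    # assumed subscription price, $/month

        subscription_total = months * plus_per_month
        print(f"API: ${api_total:.2f} vs subscription: ${subscription_total:.2f}, "
              f"saved: ${subscription_total - api_total:.2f}")
        # API: $8.70 vs subscription: $720.00, saved: $711.30
        ```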