• EtnaAtsume@lemmy.world · 3 days ago

    Servers must be having a hell of a time since I can’t get a verification code sent to my email 😆

    • wndy@awful.systems · 2 days ago

      Like, it’s not even THAT much better. I mean, not so much so that everyone should flood it lmao. The main plus was no restriction on tokens used, but that’s useless when it’s getting overloaded all the time.

      I would say it’s just barely noticeably better than the free tier of GPT. Which makes it a little annoying to go back but w/e.

        • wndy@awful.systems · 1 day ago (edited)

          Not people who can’t afford 100k to spin up their own servers. It’s going to be a game changer for AI startups and the like, though, since they won’t have to spend as much as previously thought.

          edit: Basically, numbers out of my ass, but it’s like they reduced the amount you have to spend to get ChatGPT-level output from $500k to $100k. Amazing and all, definitely newsworthy, but uh… not directly relevant for us little folk; it’s more about the ripple effects.

        • Architeuthis@awful.systems · 2 days ago

          The 671B model, although ‘open sourced’, is a 400+GB download and is definitely not runnable on household hardware.
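
          For a rough sense of scale, here’s a back-of-envelope sketch (not an official spec; the only figure taken from the comment above is the 671B parameter count) of what the raw weights alone would occupy at a few common precisions:

          ```python
          # Back-of-envelope estimate of raw weight storage for a 671B-parameter model.
          # Ignores activation memory, KV cache, and runtime overhead, so real
          # requirements are higher than these numbers.

          PARAMS = 671e9  # parameter count quoted above

          for label, bytes_per_param in [("FP16/BF16", 2.0), ("FP8", 1.0), ("4-bit", 0.5)]:
              gb = PARAMS * bytes_per_param / 1e9
              print(f"{label:>9}: ~{gb:,.0f} GB just for the weights")
          ```

          Even the most aggressive line there (~336 GB at 4-bit) dwarfs the 24 GB of VRAM on a high-end consumer GPU, which is why the download size alone rules out household hardware.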