I have an RTX 2080ti.

I still play at 1080p/60 Hz, and the 2080 is plenty for that. But I’m looking to train some ML models, and the 11 GB of VRAM is limiting there.
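
Rough math on why 11 GB runs out fast when training (the model size and optimizer here are just illustrative assumptions on my part):

  # Rough VRAM estimate for full fine-tuning with Adam and fp16 weights.
  # Assumes a ~1.5B-parameter model (GPT-2 XL class) as an example.
  params = 1.5e9
  weights = params * 2          # fp16 weights: 2 bytes each
  grads = params * 2            # fp16 gradients: 2 bytes each
  adam_states = params * 4 * 2  # two fp32 moment buffers: 8 bytes per param
  total_gb = (weights + grads + adam_states) / 1e9
  print(f"~{total_gb:.0f} GB before activations")  # ~18 GB, well past 11 GB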

So I plan to buy a new one. I also don’t want an ML-only GPU, since I don’t want to maintain two cards.

Since I’m upgrading, I need to think about future compatibility. At some point I will move to at least 2K, though I’m still not sold on 4K offering any perceivable benefit.

Given all this, I wanted to check with folks who have either card: should I consider the 4090?

    • Thurgo@lemm.ee · 1 year ago

      I just went from a 3570k to a 12600k and will run this 1080 till it can’t run no more.

  • simple@lemmy.mywire.xyz · 1 year ago

    I bought a 4090 just to run LLMs and Stable Diffusion, with some occasional gaming. But if you’re just going to use it for ML, get whatever is cheaper (ironically, I found the 4090 cheaper than the 3090 when shopping around).

    • TheTrueLinuxDev@beehaw.org · 1 year ago

      The 7900 XTX recently got support for Stable Diffusion and LLMs. On paper it’s faster than the RTX 4090 for FP16 computation, and it does seem faster judging by my experience with a rented RTX 4090 on RunPod versus my own 7900 XTX: 14 seconds (RTX 4090) vs. 6 seconds (7900 XTX).

      The 7900 XTX is an option if you want something about $1000 cheaper than the RTX 4090 with a similarly sized VRAM pool and comparable performance.
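
      If you want to reproduce that kind of comparison yourself, here’s a minimal timing sketch (assuming the Hugging Face diffusers library and a PyTorch build matching your GPU; the model and step count are my own picks):

        # Times one Stable Diffusion generation; works on NVIDIA (CUDA) and
        # AMD (ROCm) builds of PyTorch, which both expose a "cuda" device.
        import time
        import torch
        from diffusers import StableDiffusionPipeline

        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5",  # example model, swap for yours
            torch_dtype=torch.float16,         # the FP16 path compared above
        ).to("cuda")

        torch.cuda.synchronize()
        start = time.perf_counter()
        image = pipe("a castle at sunset", num_inference_steps=50).images[0]
        torch.cuda.synchronize()
        print(f"took {time.perf_counter() - start:.1f}s")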

    • MoonRocketeer@beehaw.org · 1 year ago

      I’m doing summer research with a focus on ML. I just built my computer and picked AMD because of the price, but didn’t know that Nvidia was the one to pick at the moment for that kind of work. I don’t know enough about hardware and could use the school labs anyway, but I should have done better research (ironic, heh).

  • KingRandomGuy@kbin.social · 1 year ago

    How much ML training will you do, and what kind of models? Are you just a hobbyist, or are you a student or researcher in ML?

    If the former, you may be better served by renting a machine for training instead. Vast.ai is one such service, and you can rent machines with a 4090 for something like 50 cents an hour. For hobbyist stuff this usually ends up cheaper than buying a whole card, especially if you find out you need multiple GPUs to train your model effectively.
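
    Back-of-the-envelope on that (the card price and hourly rate are rough assumptions):

      # Break-even point: hours of rented 4090 time one purchase would buy.
      card_price = 1600.00  # USD, rough 4090 street price (assumption)
      rent_rate = 0.50      # USD/hour, rough vast.ai 4090 rate (assumption)
      print(f"Break-even after {card_price / rent_rate:.0f} rented hours")
      # -> 3200 hours, i.e. months of continuous training before buying wins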

    If you’re a researcher, though, a 3090 might be a good buy. IMO the gains from a 4090 won’t be too crazy unless you’re doing specific mixed-precision stuff (the newer-gen tensor cores support more data types). Be aware that the large models that necessitate 24 GB of VRAM usually require many GPUs to train in a reasonable amount of time, so a large-VRAM GPU is more useful for quick development and debugging than for training large models, in which case the 4090 wouldn’t be all that much better.
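
    For reference, the mixed-precision stuff I mean looks roughly like this in PyTorch (a minimal sketch; the model and batch are placeholders, and the bf16 shown here runs on both the 3090’s and 4090’s tensor cores, while fp8 is Ada-only):

      # Minimal mixed-precision training step using torch.autocast.
      import torch

      model = torch.nn.Linear(1024, 10).cuda()   # placeholder model
      opt = torch.optim.AdamW(model.parameters())
      x = torch.randn(32, 1024, device="cuda")
      target = torch.randint(0, 10, (32,), device="cuda")

      # Forward pass runs eligible ops in bfloat16 on the tensor cores;
      # parameters stay fp32, and no GradScaler is needed with bf16.
      with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
          loss = torch.nn.functional.cross_entropy(model(x), target)
      loss.backward()
      opt.step()
      opt.zero_grad()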

  • acedelgado@kbin.social · 1 year ago

    I game at 3440x1440 ultrawide and upgraded from a 3090 to a 4090. The 4090 is significantly faster and smoother. And DLSS 3 frame generation is no joke: in Hogwarts Legacy with every setting cranked up and max ray tracing, turning on DLSS 3 took me from around 80 fps with noticeable 1%-low stutters to pegging the 144 fps limiter I set in game. Smooth as butter.

    Also, I mess around with some AI things like Stable Diffusion, and it’s much faster for that as well. As much as I hate the term “future-proof”, the 4090 is more worth it in that regard, IMO.

  • fiah@discuss.tchncs.de · 1 year ago

    upgrade your monitor

    Of all the possible choices out there, the 3090 is pretty ass, but the 4090 is actually one of the best. Just not at 1080p; you ought to have ditched that back when you got the 2080.

    • parmesancrabs · 1 year ago

      Even if it’s “just” to get a notably higher refresh rate. If you’re considering 4090-level prices, a lovely high-refresh-rate 1440p monitor would be a great sweet spot.

      Though I’d maybe say different if it’s a business expense that earns you revenue and gaming is only a lighter-touch use.

      • fiah@discuss.tchncs.de · 1 year ago

        a lovely high-refresh-rate 1440p monitor would be a great sweet spot

        nah, screw that, a 4090 is wasted on anything not 4K in my opinion

        • MDKAOD · 1 year ago

          Sincerely, why? I ran 1440p 165 Hz with my 3070 with mediocre success, and even with my 4090, games like Jedi: Fallen Order and Valhalla hitch and stutter occasionally. Paired with a 7900X, I was expecting better, and I feel like a lot of people get hung up on the numbers.

          • fiah@discuss.tchncs.de · 1 year ago

            I ran 1440p 165 Hz … even with my 4090, games like Jedi: Fallen Order and Valhalla hitch and stutter occasionally

            you could’ve been playing at 4K with those same stutters instead, since those stutters were definitely not caused by a lack of GPU rendering power

    • Awwab@kbin.social · 1 year ago

      Monitor upgrades are one of the big reasons why I’m still rocking a 1070 Ti. If I buy an expensive new video card, then I also need to upgrade my ultrawide, and I’m not ready for that yet.