How did we get here?

  • Contramuffin@lemmy.world · 5 months ago

    I’ll be completely honest, that’s probably the coldest take someone can make about recent tech that I’ve seen, and it’s being presented as a hot take.

    Virtually everyone prefers native, almost aggressively so. That being said, I think there’s important nuance that’s missing in most talks about upscaling. In my testing, my experience of blurring and smearing with upscaling/frame gen seems to be hugely dependent on pixel density. If you get a really dense screen, then upscaling, in my experience at least, becomes virtually undetectable even at 1080p.
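
For a rough sense of what "a really dense screen" means here, pixel density is just the diagonal pixel count divided by the diagonal size. A quick sketch (the panel sizes below are my own examples, not from this thread):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count over diagonal panel size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 1080p spread across a few common panel sizes
for size in (24, 27, 32):
    print(f'1080p @ {size}": {ppi(1920, 1080, size):.0f} PPI')
```

Smaller panels (or more viewing distance) raise the effective density, which fits the observation that upscaling artifacts become harder to spot.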

    • HSR🏴‍☠️@lemmy.dbzer0.com (OP) · 5 months ago

      probably the coldest take someone can make about recent tech that I’ve seen, and it’s being presented as a hot take

      That’s exactly what the “we have you surrounded” meme template conveys (at least according to my understanding): a popular opinion, but ironically presented as a fringe opinion.

      So no, this isn’t really intended as a “hot take”; there seems to be a decent number of people who dislike TAA, for example. I’m pointing out a trend in the industry: devs are using temporal or upscaling tools to make games run/look better, and GPU vendors support those tools to squeeze the most fps out of their cards. At this point TAA is the standard AA method and is integral to how some games are rendered, and upscaling is advertised as basically free* performance. Unfortunately, by its nature, all this temporal tech doesn’t work well at low framerates and resolutions, which is exactly the scenario where it would be most useful.

      I would agree that most artifacts and the softening effects of upscaling are less visible on higher-density screens, or when you’re sitting further from the screen. Unless your TAA/upscaling implementation is absolutely botched, in which case it will always look like garbage, but that’s not really the fault of a specific technology.
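
The low-framerate weakness of temporal techniques can be illustrated with a toy accumulation model (my own sketch, not any specific TAA implementation): each frame blends a fraction alpha of the new sample with the accumulated history, so stale data decays per *frame*, not per second.

```python
# Toy model of temporal accumulation. After n blend steps, a stale sample
# still carries (1 - alpha)**n of its original weight.
ALPHA = 0.1  # per-frame blend weight (assumed value for illustration)

def stale_weight(fps: float, seconds: float, alpha: float = ALPHA) -> float:
    """Fraction of old (potentially smeared) history left after some wall-clock time."""
    frames = int(fps * seconds)
    return (1 - alpha) ** frames

# After 0.1 s of wall-clock time (e.g. during camera motion):
print(f"60 fps: {stale_weight(60, 0.1):.2f} of old history remains")
print(f"30 fps: {stale_weight(30, 0.1):.2f} of old history remains")
```

At half the framerate, half as many blend steps happen in the same wall-clock window, so stale (smeared) history lingers longer on screen.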

  • moody@lemmings.world · 5 months ago

    I don’t play a lot of AAA games, but ngl I’m quite happy gaming at 1080p on my 27" monitors.

    I get way better framerates, and it still looks plenty good with maxed out graphics, as long as I’m not sticking my face right up against the screen.

    • ichbinjasokreativ@lemmy.world · 5 months ago

      Have you actually tried 4k though? Yes, framerates are lower, but boy does it look better. For me, and that’s just my take on it, 1080p ends at 24" monitors.

    • Crozekiel@lemmy.zip · 5 months ago

      Same. Honestly my eyes aren’t good enough to notice a difference between 1080p and 1440p (or 4k) at the scale of my pc monitor, but I damn sure notice a difference between 60 fps and 200 fps…

    • HSR🏴‍☠️@lemmy.dbzer0.com (OP) · 5 months ago

      Glad it works for you. Since I upgraded to a 1440p monitor (I still have the same GPU) I went from comfortable high-ultra settings to mid-high settings + FSR Quality in more demanding titles. From the games I played in both 1080p and 1440p, I’d say that less GPU-intensive titles definitely look better in high-res, but I found the overall experience quite whelming.

      Simply playing on a higher-res monitor won’t necessarily give you better visuals if you don’t have the GPU power to match the settings; at that point it’s not “higher res = better visuals” but “more powerful PC = better visuals”, which, duh, of course it will look better.
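
The GPU-power gap behind that tradeoff is easy to quantify, since fragment-shading work scales roughly with pixel count (a first-order model; bandwidth and vertex work complicate the real picture):

```python
# Raw pixel counts behind the resolution jumps (simple arithmetic, not a benchmark)
px_1080p = 1920 * 1080
px_1440p = 2560 * 1440
px_4k    = 3840 * 2160

print(f"1440p / 1080p: {px_1440p / px_1080p:.2f}x the pixels")
print(f"4k    / 1080p: {px_4k / px_1080p:.2f}x the pixels")
```

Roughly 1.78x the pixels for 1440p and 4x for 4k, which is why the same GPU drops from high-ultra to mid-high settings after the upgrade.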

      • R0cket_M00se@lemmy.world · 5 months ago

        Well there’s your problem, you wanted better resolution but didn’t match it with a GPU upgrade.

        Gotta have both or you’ll suffer a bit of loss.

      • moody@lemmings.world · 5 months ago

        That’s basically the feeling I get. I’ve been gaming on PC since the days of CRT monitors that could run many different resolutions. The tradeoff was always quality vs resolution vs framerate, but nowadays LCD/LED monitors have a fixed native resolution, so that’s one factor to take out of the equation. Nobody wants to play games at a non-native resolution.

  • pythonoob@programming.dev · 5 months ago

    I still play in 1080. I believe monitors shouldn’t cost as much as my build. (Yes I’m exaggerating for comedic effect).

  • wizardbeard@lemmy.dbzer0.com · 5 months ago

    Friendly tip: for singleplayer games, you can always disable the game’s built-in AA solution and use ReShade for AA instead. If you have extra GPU power, you can also use ReShade to add all sorts of other graphical effects, if you’re willing to fiddle around with things to get it looking good.

    If you have an Nvidia card, PCGamingWiki sometimes has instructions for tweaks you can make in Profile Inspector to adjust how the driver applies AA to a game, too.

  • Glifted@lemmy.world · 5 months ago

    I’m still running a 1060. You’d be surprised what you can play if you’re willing to put up with shit graphics

  • Dr. Wesker@lemmy.sdf.org · 5 months ago

    Solution is to not play modern AAA garbage. For the $40-50 pricetag, I could get a handful of great indie games off my wishlist. Games that won’t bat an eye at an aging GPU.

    • PeterPoopshit@lemmy.world · 5 months ago

      They would make a lot more money if they made games run on older hardware. Most people can’t play Cities: Skylines 2. Most people can’t play Kerbal Space Program 2. We don’t want photorealistic graphics. Just give us Fallout 3-era graphics, because that’s good enough. Fuck.

      • R0cket_M00se@lemmy.world · 5 months ago

        I agree in theory but Fallout 3 is a horrible example, the art direction was the epitome of the era’s fascination with bland brown and sickening green landscapes.

        The answer here is to ask whether photorealism matters to the game or not; if another type of art direction suits it better, do that. Hell, look at Boltgun, or even just games that used contrast and bold colors to their advantage like Halo 3 or Mass Effect 2.

  • LouNeko@lemmy.world · 5 months ago

    Remember when you could force MSAA through Nvidia Control Panel on almost any game without issue? Pepperidge Farm Remembers.

  • ferret@sh.itjust.works · 5 months ago

    The completely unfounded death of MSAA in modern games is devastating. It was (and still is!!!) so much better than every alternative.

      • umbrella · 5 months ago

        Still looks better than all alternatives by far though.

        edit: second best, just remembered DLAA is a thing, whenever it’s actually implemented.

      • ferret@sh.itjust.works · 5 months ago

        Games like Deep Rock Galactic have every reason to use MSAA but don’t anyway because game engines decided it was unnecessary, and small devs like that can’t be arsed to maintain their own implementation.

        Also textures and normal maps don’t need anti-aliasing because they will already have it baked in. Shaders are a similar situation where any aliasing will be situational and should be handled by the shader itself. (If it even makes sense to do so)
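
The "baked in" anti-aliasing for textures usually refers to mip-mapping: pre-filtered, progressively halved copies of the texture that the sampler picks from based on screen-space footprint. A toy version of the pre-filtering step (assuming a square power-of-two texture; real engines do this offline or at load time):

```python
import numpy as np

def build_mip_chain(tex: np.ndarray) -> list:
    """Build a toy mip chain by 2x2 box filtering down to 1x1."""
    mips = [tex]
    while mips[-1].shape[0] > 1:
        t = mips[-1]
        # average each 2x2 block to halve the resolution
        mips.append((t[0::2, 0::2] + t[1::2, 0::2] +
                     t[0::2, 1::2] + t[1::2, 1::2]) / 4.0)
    return mips

chain = build_mip_chain(np.ones((8, 8)))
print([m.shape for m in chain])  # (8,8), (4,4), (2,2), (1,1)
```

Because distant texels are sampled from an already-filtered level, texture aliasing is largely solved before any screen-space AA runs.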

  • Coreidan@lemmy.world · 5 months ago

    Sounds like you care more about eye candy than gameplay. To each their own.

  • Tattorack@lemmy.world · 5 months ago

    Most AAA games have been so shitty lately that calling something “AAA” these days is almost like saying a bad word.

    Gods, the amount of disappointments I’m glad I didn’t waste money on. The biggest spending I’ve done on gaming lately is buying myself a Steam Deck. Now I’m enjoying my backlog of indies I got from Humble Monthly.

  • Venator@lemmy.nz · 5 months ago

    DLAA with frame generation seems pretty good to me in cyberpunk.

  • umbrella · 5 months ago

    I’ve replayed rdr2 recently and TAA absolutely DESTROYS the beautiful visuals of the game.

    That said, stuff like DLSS is a godsend and looks 90% there depending on the situation. It’s simply another tradeoff you can make.

    My old nvidia can punch above its weight because of it.

  • mlg@lemmy.world · 4 months ago

    This is especially apparent, and a major downgrade, in War Thunder: you need to be able to see player tanks and aircraft in the distance, which end up being like 3 pixels on your screen, and DLSS will make them invisible lol.

    There are definitely some games that benefit from this, but the inconsistency is still enough that it’s not really useful for FPS and multiplayer games, where even the slightest change on your screen can dramatically affect your gameplay. A sharp 60 fps is much preferable to an AI-generated 120 fps that may remove detail and accuracy.
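
The few-pixel-target problem can be sketched numerically: when the internal render resolution drops (which is what upscalers reconstruct from), a sub-pixel-sized target loses most of its contrast before the upscaler ever sees it. A toy numpy illustration (my own example, not a measurement of DLSS):

```python
import numpy as np

def downsample2x(img: np.ndarray) -> np.ndarray:
    """2x2 box downsample, standing in for rendering at half resolution."""
    return (img[0::2, 0::2] + img[1::2, 0::2] +
            img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

# A lone bright 1-pixel target (a distant tank) on a dark background
frame = np.zeros((8, 8))
frame[3, 3] = 1.0

low_res = downsample2x(frame)
print(frame.max(), low_res.max())  # contrast drops from 1.0 to 0.25
```

A temporal upscaler may also reject such a flickering single-sample feature as noise, which would make the target vanish entirely rather than just dim.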