TL;DW:

  • FSR 3 is frame generation, similar to DLSS 3. It can greatly increase FPS, by roughly 2-3x.

  • FSR 3 can run on any GPU, including consoles. They made a point about how it would be dumb to limit it to only the newest generation of cards.

  • Every DX11 & DX12 game can take advantage of this tech via HYPR-RX, which is AMD’s software for boosting frames and decreasing latency.

  • Games will start using it by early fall; the public launch will be by Q1 2024.

It remains to be seen how good or noticeable FSR 3 will be, but if it actually runs well, I think we can expect tons of games (especially on console) to make use of it.

  • Dudewitbow · 10 months ago

    I’m not saying Reflex is bad or that esports pros don’t use it. It’s just that “theoretical” is not the best choice of word for the situation, because Reflex does make a difference; it’s simply much harder to detect. It’s like comparing the latency of two similar but not identical framerates, or the experience of two refresh rates that are close to each other. Especially at the high end, you stop being limited by framerate input properties and become bottlenecked by screen characteristics instead (which is why OLEDs are better than traditional IPS, but can be beaten by high-refresh-rate IPS/TN with BFI).

    Regardless, the point is less about the tech and more about the idea that AMD doesn’t innovate. It does, but it takes longer for people to see it, because they either choose not to use a specific feature or are completely unaware of it, whether because they don’t use AMD or because they have a fixed channel for their news.

    Let’s not forget that over a decade ago, AMD’s Mantle was what brought Vulkan/DX12-style performance to PC.

      • Dudewitbow · 10 months ago

        Because AMD’s GPU division is a much smaller division in an overall larger company. They physically can’t push out as many features because of that. When they decide to make a drastic change to their hardware, it’s rarely noticed until it’s considered old news. Take, for example, Maxwell and Pascal. You don’t see a performance loss at the start, because games are designed for the hardware of the time, in particular whatever is most popular.

        Maxwell and Pascal had a notable trait that allowed their lower power consumption: the lack of a hardware scheduler, as Nvidia moved scheduling into the driver. This gave Nvidia more manual control of the GPU pipeline, letting their GPUs handle smaller pipelines better, compared to AMD, whose hardware scheduler had multiple pipelines that an application needed to use properly to maximize performance. It let Maxwell/Pascal cards have better performance… until it didn’t, as devs started to thread games better, and what used to be a good change for power consumption evolved into a CPU overhead problem (something Nvidia still has to this day relative to AMD). AMD’s innovations tend to be more on the hardware side of things, which makes them pretty hard to market.

        It was like AMD’s marketing for Smart Access Memory (again, a feature AMD got to first, and one that to this day works slightly better on AMD systems than on others). It was hard to market because there isn’t much of a wow factor to it, but it is an innovation.

          • Dudewitbow · 10 months ago

            Which then comes down to the question of price/perf. It’s not in dispute that DLSS is better than FSR, but when you factor in price, some price tiers start to get funny, especially at the low end.

            For the LONGEST time, the RX 6600, which by default was about 15% faster than the 3050 and was significantly cheaper, was still outsold by the 3050. Using DLSS to cover performance that another GPU reaches natively (meaning objectively better: no artifacts, no added latency) is where the argument of never buying a GPU without DLSS becomes weak, because in some price brackets what you could get at the same or a similar price might be significantly better.
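The price/perf point can be made concrete with simple perf-per-dollar arithmetic. The ~15% performance gap comes from the example above; the street prices below are assumed figures for illustration, not quoted from the thread.

```python
# Illustrative perf-per-dollar comparison for the RX 6600 vs RTX 3050
# example above. Prices are assumed street prices, not official MSRPs.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    return relative_perf / price

rtx_3050 = perf_per_dollar(1.00, 280.0)  # baseline performance, assumed $280
rx_6600 = perf_per_dollar(1.15, 230.0)   # ~15% faster, assumed $230

print(f"RX 6600 delivers {rx_6600 / rtx_3050:.2f}x the perf per dollar")
```

Under these assumptions the cheaper, faster card wins on value by a wide margin, which is the crux of the argument: a feature advantage has to outweigh a raw perf-per-dollar deficit.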

            In terms of modern GPUs, the 4060 Ti is the one card that everyone, for the most part, should avoid (unless you’re a business in China that needs GPUs for AI due to the U.S. government limiting chip sales).

            It’s sort of the same idea with RT performance. Some people talk as if AMD can’t do RT at all. Usually AMD’s RT performance is a generation behind, so in matchups like the 7900 XTX vs the 4080, the value can swing toward the 4080; but in situations like the 7900 XT, which was at some point being sold for $700, its value, RT included, was significantly better than the 4070 Ti as an overall package.

              • Dudewitbow · 10 months ago

                Which is what I’m saying, on the condition, of course, that the GPUs are priced close enough (e.g. 4060 vs 7600). But when there’s a deficiency in a card’s spec (e.g. 8GB GPUs) or a large discrepancy in price, it would usually favor the AMD card.

                It’s why the 3050 was a terribly priced GPU for the longest time, and currently the 4060 Ti is the butt of the joke; nobody should pick those over the AMD cards in the same price range, due to both performance and hardware deficiency (VRAM, in the case of the cheaper 4060 Ti).

                  • Dudewitbow · 10 months ago

                    In the case of the 4060 Ti 8GB, turning on RT can push VRAM usage past the 8GB threshold, killing performance, so hardware deficiency does matter in some cases.