• 8ender@lemmy.world · +67/-4 · 1 year ago

    Why is it bizarre? They clearly put all their effort into making it run on the Xbox, and that’s AMD hardware.

    • ilickfrogs@lemmy.world · +3/-1 · 1 year ago

      Because the author is bizarrely ignorant for something that’s published. Oh wait, it’s 2023. It’s about click-through, not accuracy. My b.

      • WarmSoda@lemm.ee · +13/-2 · 1 year ago

        Someone tried to argue that this game is as polished as Tears of the Kingdom lol

        • 520@kbin.social · +4/-2 · 1 year ago

          Ahahahahahaha!

          The worst that game suffers from is duplication glitches.

          • Jakeroxs@sh.itjust.works · +4/-6 · 1 year ago

            It’s not even a close comparison. Nintendo games look like ass because they have a max resolution of 1600x900 at 30fps; add in the in-game texture resolution and it’s obvious why PC games often run “worse.” Also, they have one console they release on instead of the literally millions of possible different PC configurations.

            Just a dumb comparison to any PC games

            • 520@kbin.social · +3/-1 · edited · 1 year ago

              It’s not even a close comparison, Nintendo games look like ass because they have a max resolution of 1600x900 and 30fps, add in the texture resolution of things in game as well and it’s obvious why PC games often run “worse” also…

              Did you account for the fact that Nintendo was developing for massively underpowered handheld hardware? And not significantly more powerful Xbox Series consoles? And actually made their games to fit the strengths and limitations of their target hardware?

              They have one console they release on instead of the literally millions of possible different PC configurations.

              You would have a point…if Starfield ran with decent performance on even the Xbox Series X. You know, the target platform?

              • Jakeroxs@sh.itjust.works · +1/-1 · edited · 1 year ago

                It does run well, at 30fps like they specified. I’m not sure what reported performance issues you’re looking at.

                • Sethayy@sh.itjust.works · +2 · 1 year ago

                  Most of the original comments were about bugginess, which is just bad programming; hence the lack of polish.

                • 520@kbin.social · +2/-1 · edited · 1 year ago

                  …the fact that it has to run at 30fps on powerful hardware despite having nothing to show for it?

                  To put it another way, how the fuck is it not targeting 60 on the Series X? I could understand it for the Series S, but there is little to no fidelity improvements on show like they said there’d be.

  • Muddybulldog@mylemmy.win · +14/-3 · 1 year ago

    AMD being a “partner” is business speak for “AMD paid us a bunch of money because having their brand on our product is a much larger advertising reach than they can accomplish on their own”.

    That performance is better on AMD is in no way “bizarre”… it’s exactly what would be expected.

  • BigFig@lemmy.world · +9/-2 · 1 year ago

    It is? Am I doing something wrong? Because I get a solid 60-70 fps at all times on a 3070ti

    • MrMusAddict@lemmy.world · +4 · 1 year ago

      From what I recall from one of their Directs, Digital Foundry corroborated another outlet’s finding that ultra settings (and I think specifically ultra shadows) are unoptimized. Tons of weird frame time jittering, and like a 15% drop in FPS compared to AMD. So, if you have shadows turned to High or lower, that’ll explain it. Otherwise, what they’re saying is an AMD equivalent would be getting 70-80 fps in your case.
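
The arithmetic behind that last estimate can be sketched quickly. The ~15% penalty figure is quoted from memory of the Digital Foundry analysis, not measured here, so `amd_equivalent` and its default value are illustrative assumptions only:

```python
# Back-of-the-envelope sketch: if an Nvidia card loses ~15% relative to
# a comparable AMD card in this game, the "AMD equivalent" of an
# observed Nvidia frame rate is the Nvidia rate divided by (1 - penalty).
def amd_equivalent(nvidia_fps: float, penalty: float = 0.15) -> float:
    """Estimate the frame rate a comparable AMD card would get,
    assuming the Nvidia result is (1 - penalty) of the AMD result."""
    return nvidia_fps / (1.0 - penalty)

for fps in (60, 70):
    print(f"{fps} fps on Nvidia is roughly {amd_equivalent(fps):.0f} fps on AMD")
```

Plugging in the 60-70 fps range from the comment above gives roughly 71-82 fps, consistent with the "70-80 fps" estimate.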

      • DarkThoughts@kbin.social · +1 · 1 year ago

        My 6650 XT isn’t quite the same but I can’t even get those framerates with low to medium settings. lol
        It’s more like 40-75 FPS depending on the location.

      • Ethanice@kbin.social · +2 · 1 year ago

        Not original dude but
        I’m using an i7 11700k and a 1660 super and getting 60fps with occasional drops to 50.

      • sverit@feddit.de · +1 · 1 year ago

        My old i7-9700k is on ~50% on all 8 cores, my 3080Ti is on 100% and I get about 70-90fps.

    • Virkkunen@kbin.social · +6/-5 · 1 year ago

      Noo you’re not allowed to enjoy the game, the internet people said that Bethesda are terrible developers and the game runs like shit

    • Dudewitbow · +1/-1 · 1 year ago

      The game runs well on midrange PCs, but it scales terribly as you move up the stack. It has the problem Crysis had.

    • zurohki@aussie.zone · +1 · 1 year ago

      Not surprising, game devs aren’t really testing with Intel GPUs or working with Intel driver devs.

      There will probably be a driver fix in a week or two though. Intel seem to be progressing fast.

  • Halosheep@lemm.ee · +4 · 1 year ago

    If anyone wasn’t aware, there is a mod to replace FSR2 with DLSS and it is INCREDIBLE for performance if your system supports it. I went from all minimum with 40-50fps to well over my 144 target on medium (indoors) and running okay (60 with some drops) in tough scenes outdoors.

    Running on a 2070.
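
Most of the gain from swapping in an upscaler comes from rendering fewer pixels internally. A rough sketch, assuming the commonly cited per-axis scale factors for the standard quality presets (actual per-game values can differ):

```python
# Upscalers like DLSS and FSR2 render internally at a fraction of the
# output resolution and reconstruct the rest, which is where most of
# the fps gain comes from. The per-axis factors below are the commonly
# cited preset defaults; treat them as assumptions, not exact values.
PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(width: int, height: int, preset: str) -> tuple[int, int]:
    """Internal render resolution for a given output size and preset."""
    s = PRESETS[preset]
    return round(width * s), round(height * s)

for name in PRESETS:
    w, h = internal_resolution(2560, 1440, name)
    print(f"{name}: renders {w}x{h}, upscaled to 2560x1440")
```

At 1440p "Performance", for example, the GPU is only shading a 720p image, which is why a 2070 can suddenly clear much higher frame targets.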

    • Asafum@feddit.nl · +2 · 1 year ago

      I tried that and saw no difference unfortunately.

      I’m just confused because I have an i9 9900k and a 4070ti; neither is even remotely close to being worked hard, and yet I can’t break 50fps. Like half VRAM usage and maybe 10% CPU/GPU usage. I thought I had it running smoothly, but it’s a smooth 30fps…

        • Asafum@feddit.nl · +2 · 1 year ago

          Lol yeah it’s all hooked up correctly. Just don’t get why it seems to not be utilizing my system efficiently. :/

          • Hadriscus@lemm.ee · +1 · 1 year ago

            I have a 4070ti as well and it struggles a bit in NA. Probably under twenty FPS in the most crowded places

          • SwampYankee@mander.xyz · +1 · edited · 1 year ago

            Finally took the time to check my performance. I’ve got a 3070ti and a Ryzen 9 3900x, CPU is at ~55% utilization and GPU is ~95% at maybe 40 fps average. That’s at ultra on all settings, with the DLSS mod and a render scale of 80%.

            Have the recent driver & game updates helped at all?

            • Asafum@feddit.nl · +1 · 1 year ago

              I’m an idiot: for the entire year I’ve had my 4K monitor, I never enabled the setting that allows it to run at 60Hz, so I was always locked to 30fps with vsync…

              I couldn’t say if the recent update helped as I just discovered this new level of my idiocy yesterday lol

  • Send_me_nude_girls@feddit.de · +3 · 1 year ago

    You can force it to use Resizable BAR and get more fps. It just needs to be enabled, and it’s such an easy thing for the Bethesda devs to do, yet people need Nvidia Profile Inspector to enable it. For no reason.

    • Psythik@lemm.ee · +2 · 1 year ago

      You mean to tell me that enabling ReBAR in the BIOS doesn’t automatically enable it for every game?

      • MooseBoys@lemmy.world · +2/-1 · 1 year ago

        The BIOS setting enables the bus feature in hardware. But the driver also needs to support it.

    • MooseBoys@lemmy.world · +2/-1 · 1 year ago

      force it to use resizable bar and get more fps

      If this is true, it means the game is designed around a UMA architecture, i.e. xbox. Nobody in their right mind tries to map more than 256MB of CPU memory concurrently for a single frame. Either that or the engine is completely shit at resource streaming (also characteristic of console-first games), and so is relying on the OS to demand-page random resources as needed.
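
A back-of-the-envelope illustration of that 256MB point, with made-up numbers: streaming a large working set through the legacy BAR window means repeatedly re-mapping it, whereas Resizable BAR exposes the whole VRAM aperture at once. `remaps_needed` and the sizes used here are hypothetical, purely to show the scale of the difference:

```python
import math

# With the legacy PCIe BAR, the CPU can only see a 256 MiB window of
# VRAM at a time, so touching a working set larger than that means
# repeatedly moving the window. With Resizable BAR the whole VRAM is
# mappable and no window moves are needed.
LEGACY_BAR_MIB = 256

def remaps_needed(working_set_mib: int, bar_mib: int = LEGACY_BAR_MIB) -> int:
    """Number of times the BAR window must be moved to cover the set."""
    return math.ceil(working_set_mib / bar_mib)

print(remaps_needed(4096))                  # 4 GiB through a 256 MiB window: 16 moves
print(remaps_needed(4096, bar_mib=16384))   # with a 16 GiB ReBAR aperture: 1
```

Which is why an engine written against a console UMA model, where all memory is uniformly visible, can leave easy performance on the table when ReBAR is left off on PC.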

  • Bye@lemmy.world · +6/-3 · 1 year ago

    Kept getting “your GPU is too old” error messages, although I’ve seen others online play it on a 980ti. All the Google results said to update Windows, but I wasn’t even on Windows. Gave up on it.

    • Alto@kbin.social · +8 · 1 year ago

      In my mind the 980ti is still only a couple years old, so I went to check.

      Eight years. Fuck me, I’m still not used to this starting to feel old thing.
      • Bye@lemmy.world · +2 · 1 year ago

        Yeah, but they say you can run it on a 1070, and that’s basically the same level as a 980ti.

        Anyways, I saw 980ti benchmarks out there, so I don’t think that’s the problem; it’s some DX12 compatibility nonsense. Which shouldn’t be an issue, because it supposedly runs on the Steam Deck.

      • Bye@lemmy.world · +3 · 1 year ago

        Thank you so much for confirming it’s possible.

        Ok, maybe I will do a Windows dual boot so I can try to play it. I love Oblivion so much and I just want to play TES in space.

    • Skyhighatrist@lemmy.ca · +2 · 1 year ago

      I get the same error on the same hardware. Luckily, I have a steam deck I can play it on while the issue (hopefully) gets resolved in Proton. It doesn’t run very well on the steam deck, though. But it is playable.

    • Ethanice@kbin.social · +1 · 1 year ago

      I uninstalled the graphics drivers on my boyfriend’s PC and reinstalled them manually to get that fixed. The GeForce Experience install wasn’t working.

    • Psythik@lemm.ee · +1 · 1 year ago

      Jokes aside, the game runs great with the DLSS 3 FG mod installed.

      What’s stupid is that a 4090 gets similar FPS as a 7900XTX in this game without the mod. That’s just plain sabotage.

  • AutoTL;DR@lemmings.world (bot) · +1/-1 · 1 year ago

    This is the best summary I could come up with:


    Starfield is one of the most demanding games on PC that we’ve seen recently, with even the RTX 4090 paired with AMD’s latest Ryzen 7800X3D just about hitting 60fps on average at 4K with all the settings maxed out.

    As reviewers and testers scramble to figure out why Starfield is so heavy, the experts over at Digital Foundry have discovered some obvious differences between AMD and Intel / Nvidia systems.

    “If you’re on Intel and Nvidia you’re getting a bizarrely worse experience here in comparison to AMD GPUs in a way that’s completely out of the norm,” explains Alexander Battaglia in a detailed 32-minute tech analysis of Starfield on PC.

    In my testing, I’ve found the RX 6800 XT can beat the RTX 3080 in a variety of games, but 46 percent is a far bigger margin than normal.

    Starfield director Todd Howard was asked why Bethesda hadn’t optimized the game for PCs during a Bloomberg interview last week.

    That answer hasn’t satisfied the many who are wondering why Starfield doesn’t play as well on their Nvidia and Intel systems, which account for the vast majority of PC gamers in Steam’s hardware survey.


    The original article contains 630 words, the summary contains 193 words. Saved 69%. I’m a bot and I’m open source!

  • Bleeping Lobster@lemmy.world · +3/-12 · edited · 1 year ago

    It’s not bizarre, as others have pointed out AMD has clearly had a hand in making sure this performs better on their GPUs. One instance could be a coincidence but when you’ve got multiple instances of things being ‘missing’, ‘not optimised properly’ etc for RTX cards, you have to wonder whether it’s a bunch of coincidences or deliberate.

    This has taken a lot of shine off AMD for me. They seem to be employing a Russia-esque strategy of “If I can’t improve myself then I guess I’d better make things suck for other people so I don’t seem as bad”

    • Hyperreality@kbin.social · +15 · 1 year ago

      A graphics card company ensuring software performs better on their GPU?

      Time to switch to nvidia. They would never do such a thing.

      LOL

    • papalonian@kbin.social · +12 · edited · 1 year ago

      Feel like this isn’t the best take. AMD working with Bethesda to make sure the game works on their card doesn’t come close to implying they made sure it didn’t work on Nvidia cards. Nvidia should’ve been working to make sure the game ran well on their cards too.

      Nvidia has been pulling the tricks you’re talking about for years now, though

    • BananaTrifleViolin@kbin.social · +8/-1 · 1 year ago

      Microsoft owns Bethesda. Microsoft owns Xbox.

      Xbox uses AMD GPUs and CPUs.

      So the game being optimised for AMD makes absolute sense for Microsoft.

      AMD paying for access to optimise for their PC CPUs and GPUs makes sense for AMD.

      However, not optimising the game for Intel and Nvidia does not make sense for Microsoft. This is more likely an oversight/typical poor AAA game launch than a deliberate play to benefit AMD. Other games, like Cyberpunk 2077, have had problems across CPUs/GPUs; here we have selection bias, where there are fewer problems on AMD systems, plus a generally reasonably solid launch.

      It’s frustrating, but most of the issues are optimisation, not game-breaking. The experience on Intel/Nvidia systems is good, just not as good as it could be. One of the examples in the article was a framerate of 95 FPS vs 105 FPS; that may have been avoidable, but it’s a minor annoyance at best. Some of this (not all, but some) is just obsessing over minutiae and things that won’t affect the player experience.

      So basically a storm in a teacup, and much of this is the usual post-launch technical difficulty that will be optimised with patches. This is why people shouldn’t buy games at launch, although so far at least we haven’t seen the game-breaking bugs that have dogged other AAA titles at launch.

    • Nefyedardu@kbin.social · +6 · 1 year ago

      AMD has clearly had a hand in making sure this performs better on their GPUs

      NVIDIA’s entire business model is brand-exclusive proprietary software. Last I checked you can use FSR on NVIDIA but you can’t use DLSS on AMD.

      • JohnEdwa@kbin.social · +2/-1 · edited · 1 year ago

        DLSS doesn’t run on older nVidia hardware either, as it’s designed to utilize the raytracing and tensor cores of the RTX series. I recall reading somewhere that while it could technically be made to run without them, without the specific cores optimized to do the required calculations it would run terribly. Then again it might just be a blatant lie ¯\_(ツ)_/¯
        FSR, on the other hand, is designed to run on standard GPU hardware, and since the tech is open source they can’t exactly hide any code that would break compatibility with nVidia.

      • ShadowRam@kbin.social · +1/-1 · 1 year ago

        brand-exclusive proprietary software

        But to be fair, nVidia has also been pumping massive amounts of $$$ into R&D in both the Graphics and AI space.
        They need return on their R&D investment somehow.

        And it’s not like they are cutting AMD out of the AI enhanced stuff.
        They just aren’t going to spend $$$ and effort to help AMD implement their solutions, and AMD doesn’t have the hardware to run the AI functions properly.

        AMD can implement RTX if they wanted to, nVidia’s research papers are out there.
        But they can’t because they don’t have the knowledge of how to implement it.

        And it isn’t like AMD is sharing with Intel any of their R&D work they do on the CPU side.

    • Neato@kbin.social · +4 · edited · 1 year ago

      That’s a pretty bold conspiracy theory. Nvidia outsells AMD by a pretty huge margin, as does Intel in CPUs. What would get Bethesda to deliberately favor AMD tech and hobble Nvidia? That would merely earn them a LOT of negative press, as we are seeing now.

      The idea of bribery is right out, because Bethesda is owned by MS. The idea of laziness also doesn’t hold up: as above, there are more Intel/Nvidia users, so if laziness were the goal it would be easier to prioritize only the most common hardware.

      Most likely it’s as someone below said: this game was primarily designed around console performance, and both consoles, the Xbox and PS5, use AMD hardware. Bethesda is either too inept or too time-constrained to get it to run well on the primary PC hardware. This is pretty damn common in the games industry: allowing PC performance to flounder because PCs are a smaller share of sales.

      But MS has no incentive for Nvidia cards to not work well because 99% of PC users are Windows users and most likely run this on Gamepass, an MS product.

      • JohnEdwa@kbin.social · +2 · 1 year ago

        Also because you can always lower settings or throw better hardware at the problem on PC so even a badly optimized port should eventually run acceptably. But if you fuck it up on console, you get Cyberpunk on the PS4 and have to spend a ridiculous amount of time and money to make it work.

        And this is Bethesda we’re talking about; at this point I wouldn’t be surprised if the PC versions are designed from the get-go under the expectation that the modding scene will come to the rescue and fix everything for them, no matter how terrible their work is.

      • Veraxus@kbin.social · +1 · edited · 1 year ago

        What would get Bethesda to deliberately favor AMD tech and hobble Nvidia?

        From whom does Microsoft source the CPUs and GPUs for every single XBox?

        Yep.

        • Neato@kbin.social · +2/-1 · 1 year ago

          Same people Sony does. It isn’t about Nvidia. It’s about lazy developers not optimizing for PC.

    • bastion@feddit.nl · +2 · 1 year ago

      Nvidia does this all the time. If anything, I’d like to see more games focused on AMD graphics; it’s a tactic that AMD as a company has been getting whomped by.

      Ideally, though, there would be better support for all graphics platforms.

      I don’t think it’s deliberate, per se, but Bethesda was clear about who they were partnering with long ago. If a graphics company is trying to help you optimize for their platform, they aren’t going to be stressing about the impact on their competitors.