I have a 3080 ti, and a 12700k, and 32 gigs of ddr5, and a 2 terabyte ssd. It runs great for me. I don’t understand the problem. /s
So, this system runs it fine? Good to know. I was worried that my computer would not be able to run it smoothly, but now no worries at all.
I’ve got an 8086K and a 3080, running on a 4K screen - with Ultra settings and FSR at 80% I’m getting 35-40fps, which honestly doesn’t feel too bad. It’s noticeable sometimes, but it’s a lot smoother than the numbers suggest.
Because my CPU is a little long in the tooth, I’ve probably gone a bit hard on the visuals, but my framerate didn’t improve much when I lowered them. The engine itself has never really liked going past 60fps, so I don’t know why people expected to run 100+ frames at 4K or something.
Sorry mate but 35fps on a 3080 with FSR is just objectively bad performance.
Starfield isn’t doing anything, in graphics or gameplay, that other games running 3-4 times faster aren’t also doing.
That’s because they’re CPU limited, mate.
Easy 1440p60 on ultra everything with no scaling on my 3090. Frequently up in the 80-90 FPS range. This game runs fine. It’s not a “teetering mess” as you say.
What exactly is it doing that an 8086k is CPU limited to 35fps?
Completed some testing on my end using Intel’s PresentMon: sitting at a 35fps average in New Atlantis, my GPU Busy is pegged at about 99% of the frame time - so essentially no CPU bottleneck.
I do get a bit of a CPU limitation when it’s raining, but nothing significant; it dropped to about 30fps.
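For anyone who wants to repeat this check, here’s a minimal sketch of the GPU-bound test. It assumes you’ve already exported a PresentMon capture and pulled the per-frame time and GPU-busy values into two lists - column names differ between PresentMon versions, so check your CSV header; the numbers below are hypothetical, not my capture:

```python
def gpu_bound_fraction(frame_times_ms, gpu_busy_ms):
    """Return average FPS and the share of frame time the GPU was busy.

    A ratio near 1.0 means the GPU is the bottleneck; well below 1.0
    means the GPU is idling while it waits on something else (usually
    the CPU feeding it frames).
    """
    total = sum(frame_times_ms)
    avg_fps = 1000.0 * len(frame_times_ms) / total
    busy_ratio = sum(gpu_busy_ms) / total
    return avg_fps, busy_ratio

# Hypothetical capture: ~28.5 ms frames (about 35 fps),
# with the GPU busy for ~99% of each frame.
frames = [28.5, 28.3, 28.7, 28.6]
busy = [28.2, 28.0, 28.4, 28.3]
fps, ratio = gpu_bound_fraction(frames, busy)
print(f"{fps:.1f} fps, GPU busy {ratio:.0%} of frame time")
```

If that ratio were down around 0.7 instead, lowering the resolution wouldn’t help much - that’s the CPU-limited case.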
Trying 1440p with the same settings as the 3090 above got me around 50fps - though 1440p is only about 70% of the pixels of 80%-scaled 4K, so it takes less load off my GPU than you’d think, and 35fps scaled up by that pixel ratio lands right around 50fps.
I’d really not expect the performance difference between a 3090 and a 3080 to be that large, and the only other difference I can think of in our systems is the CPU (5800X3D vs 8086K).
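To put numbers on that resolution comparison - assuming FSR’s 80% is a per-axis render scale, which is how these sliders usually work:

```python
# Native 4K vs. 4K at an 80% per-axis render scale vs. native 1440p.
native_4k = 3840 * 2160                          # 8,294,400 pixels
scaled_4k = int(3840 * 0.8) * int(2160 * 0.8)    # 3072 x 1728 = 5,308,416
qhd = 2560 * 1440                                # 3,686,400

print(f"1440p is {qhd / scaled_4k:.0%} of 80%-scaled 4K")  # ~69%
```

If the 80% instead referred to total pixel count rather than each axis, 1440p would be roughly 55% of it - either way, not the dramatic cut it sounds like.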
New Atlantis is a smooth 60+ fps with every setting maxed out at 1440p.
Considering that CPU is less powerful than what’s in the Xbox Series S, which does 1080p30, I’m not at all surprised they’re getting a similar frame rate.
If this was a “teetering mess” you would have heard it in the Gamers Nexus benchmarks. Steve says nothing to this end, and the game benches predictably across the spectrum of hardware tested.
For me the game runs pretty well, at ~90+ fps on High with FSR 2 enabled.
5800X3D, 64 gigs of RAM, and a 6900XT I snagged cheap during the great GPU price collapse. And by the looks of the game, that seems pretty reasonable to me.
AMD users are having a better time with it, unsurprisingly. I wish I hadn’t gone for Nvidia but too late for that.
I think there will be patches, and some updates to NVIDIA’s shitty driver, that will fix things in the future :) Otherwise, yeah, maybe get an AMD GPU next time - don’t fall for the NVIDIA marketing. I’ve been using Radeons since the 9800 Pro bundle with Half-Life 2 and never had any issues with them or their drivers.
Hopefully. I’ve always been more of an AMD/ATI fan, but for this laptop the deal worked out to be better with an Nvidia card. But next time I’m not settling for it. AMD CPU and GPU is the way to go. Especially because I’m trying to daily on Linux now and the driver side is much much nicer with AMD.
I’ve got that but with a 4080 - no issues.
I admittedly feel like I went full removed on my build and seriously hope these specs aren’t what’s necessary…
Hey, at least you don’t have to upgrade for a while - you could probably run it for another year if Todd is generous.
You had me in the first sentence, and then I realized it was sarcasm. 🤪 I’m running a similar rig, but it’s primarily for rendering work, etc., so for juuust a second there, I wondered if it was falling behind. 😅🤓