If 90% of gamers can’t play a game, is it still worth releasing it like that?
I doubt 90% of players run the newest games at 4K/high.
It’s not the resolution:
Even with AMD FSR 1.0 at 50% resolution scale, the game cannot come close to 30fps.
Dude, if someone is spending $1.8k on just a fucking CPU and GPU together (that doesn’t include the cost of the motherboard, RAM, storage, case, monitor, or mouse), I would fucking hope they can run a new game release at fucking 60fps 4K (minimum) natively.
Game dev companies got lazy. Instead of DLSS and FSR being really great tools for older GPUs to run newer games, it became a crutch for brand new $900 GPUs to run newer games.
Don’t get me wrong, DLSS and FSR are awesome and I use them to get games to run well at 4K with my 3070 Ti, it’s just a shame so many devs are abusing it.
I think it’s a bit unfair to say they got lazy; they just lowered the priority on optimization. Even though corporate game development sucks, I don’t think I’ve seen many “lazy” game devs. Many of them work pretty hard jobs for shit pay, at least compared to other programming fields (rough crunch periods, most of their audience hates them, etc.)
Absolutely, any lazy gamedev would just quit, get a boring SWE job and work fewer hours for twice the pay.
No. This is a city builder, not cyberpunk. This is a cpu optimization problem, not a gpu one. Also, a lazy development team doesn’t add enough features for it to suffer from such major optimization problems.
Oh shit, I actually missed that last part of the headline. Mea culpa.
I just built a 7800X3D / RTX 4090 build, so I’d expect to hit 4K 60fps, but I’m more of a 1440p 240Hz guy. I guess I’ll settle for whatever I can get with this game lmao. At least it’s on Game Pass.
at 4K/High Settings
Do you believe 90% of gamers will be playing at 4K/High settings?
… on AMD’s most powerful GPU.
I mean… At the current state of the game, 0% of gamers will be playing at 4K/High settings.
I don’t know what “high” refers to in this instance, but in general I kinda wish every game had its very highest settings targeted at future hardware. Not out of bad optimization, but simply because it feels stupid playing older games that cap render distances, LoDs, foliage amounts, crowd sizes, lights, shadow qualities etc. to hardware limits that were set a decade or two ago.
Just make it obvious and don’t call it “Very High” or “Ultra”, but directly just “Next-Gen” or something in the settings and have it target like 720p 30fps on a 4090.
I’m pretty confident in saying most people aren’t interested in sub 60 FPS, especially if it’s at 1080p and looking the way it does (which is mostly flat and unimpressive)
That’s the most shocking part, the high-end hardware needed to brute force a 1080p game at acceptable framerates
Eh, I’m fine with it in this style of game. A shooter, I would not be. BG3 I accepted running at around 30 and didn’t even feel it. It’s not a twitchy game. It’s a top-down city builder. As long as it’s responsive, it doesn’t really need to run at 60. It’s probably the ideal game to target 30.
BG3 runs at a stable 1440p 100fps+ for me on a 4070 Ti without DLSS. I only enabled DLSS Quality and then capped the framerate at 90fps because I didn’t really feel the power consumption was worth it.
I’m almost in Act 3, and so far it’s been unproblematic… This game is on a totally different level.
Edit: every setting maxed out in BG3
For you. On a 40 class GPU. On a game that isn’t CPU bound.
Act 3 performs worse. Anyway, everyone has a different system. My point is that different games have different acceptable framerates: first-person games need to be at least 60, while most top-down games can be lower and you won’t even really notice.
most people aren’t interested in sub 60 FPS, especially if it’s at 1080p
Hate to say it but this is a city building sim. Above 60fps would be amazing, but Cities Skylines 1 was already known for being… not great for frame pacing or frame rates.
Obviously more is better, but you can look at any similar game and be fairly forgiving: “oh, only 37 FPS in CS1/Civ6/Rise of Industry/Urbek City Builder/Satisfactory/Dyson Sphere Program, that’s pretty solid.” The only (similar-ish) game I can think of that has actually never had bad performance is Per Aspera; with every single other one mentioned, I have had performance issues. I could also throw RimWorld and Dwarf Fortress in there, though those are different enough to be questionably relevant, and those too have had performance problems at different points in time.
That being said, it does not sound like the devs intentionally hid this info; the content creators did mention early on that there were performance issues and that Paradox was aiming to have them resolved. If there was any intentional hiding, it would probably be from Paradox as the publisher, but they seem to be relatively open this time around in regards to information.
TL;DR: Low fps in the genre ain’t that surprising; most are used to it. Obviously more is better, but they seem at least intent on addressing it, unlike some other devs.
By “people” you mean “the kind of wankers that fellate Gamers Nexus.”
Cities:Skylines has always had a frame rate that takes the strugglebus to work.
That’s basically what Crysis was when it released, so yeah why not?
Because Crysis, for its time, was breaking barriers in terms of graphics and physics. Cities Skylines 2 doesn’t even look that good (graphically), so it just comes down to poor optimization that will get fixed after half a year to a full year of patching. This isn’t a great look, even with the “but we said it will perform poorly” disclaimer.
“PURR URPTURMIZURTION”
Or… here’s a fucking idea… it’s a CPU bound game and not GPU bound. FUCKING WOW, WHO WOULD HAVE THOUGHT that the simulation game may not be graphically amazing but will wreck the shit out of any CPU with its simulation routines?
Only everyone that’s ever played any sort of in-depth simulation, that’s who.
You would have a point if you couldn’t gain 20 fps by disabling clouds and volumetric fog and dropping LOD to the bottom. Also, if it were really CPU bound, wouldn’t the FPS stay roughly the same as you increase the resolution, since you’d be putting more work on the GPU instead of the CPU?
You don’t even have the game and you are shilling for it super hard for some weird ass reason.
I don’t get why people are mad about this. I’m happy that games are coming out that destroy top setups today, because that means they will be beautiful (hopefully that’s what they are with max settings) on future hardware.
They’re ugly looking now, that’s the issue. Skylines 2 is definitely an improvement over 1, but it’s not an astronomical improvement (like the difference you’d notice with some franchises moving from Unreal Engine 4 to 5)
The amount of raw performance needed to power this game is what’s shocking. It’s just a lack of optimisation.
Do you even fucking know what that word “optimization” means, or do you just throw it around because Jim Sterling told you to?
The issue is when the game is destroying top setups because it’s poorly optimized and full of bugs, and I don’t think it was their idea to make a game for future hardware, because that would not be commercially viable.
You clearly never played CS1 on release.
That’s why you don’t run on high settings you fucking genius.
It’s a single-player game, who gives a shit how someone runs it. But if someone is spending $1.8k on just 2 parts, I think it’s fair to hope a game will run “well”; this is abysmal.
Welcome to PC gaming. New here?
Nope, it’s pretty rare for games to release in such a horrid state that even top-of-the-line stuff isn’t able to power through it. Typically it’s the midrange/low-end cards that are stuck with horrid frames and rely heavily on DLSS/FSR (even though that is annoying). The meme of “Can it run Crysis” shows how rare it is for a game to be literally unplayable at its highest settings on modern hardware.
You don’t pay much attention do you?
Here’s an eight year old thread where a bunch of REEL GAMERZ MAN whinge about the same shit as now.
https://forums.tomshardware.com/threads/what-is-happening-with-recent-aaa-games.2866464/
“Can it run Crysis” stuck around because Crysis was built for hardware that didn’t even exist yet, but was scalable enough to be played on then-current gen. It wasn’t because poor performance was rare.
Go to literally any major release’s Steam review page and you’ll see review after review shitting on the performance. Almost like there’s no way to fix shit you don’t know is broken.
Again, my literal point was that poor performance typically hits midrange and low-end rigs. This is a brand-new release that, as the fucking title of the fucking post says, is
“Cities Skylines 2 reportedly runs with 7-12fps on an Intel Core i9 13900KS with AMD Radeon RX 7900XTX at 4K/High Settings”
7-12 FPS on top-of-the-line gear is fucking stupid. Even Jedi Survivor wasn’t this bad, and that was also a game that had “poor” performance on top-tier gear.
Lower resolutions and graphical settings exist.
You have no point.
Shut the fuck up.
Really disappointed that after a solid 3-4 months of dev diaries, open communication and hype for the game, they drop this performance bombshell on us at the last moment.
They get points for at least giving everyone a week’s notice, but that’s clearly a calculated move (compared to keeping it quiet entirely and launching with people unaware)
I’m not interested in playing sub 60 FPS games at 1080p, especially not when I’ve got a 4090 + 13900K that crushes almost every other game in existence. The game isn’t pretty enough to justify such terrible performance; it’s just purely unoptimized right now.
Why there’s no DLSS/FSR at launch is also baffling; it helps GPU-bottlenecked games greatly (even if boosting from a native 30 to 60 is a bit yuck)
Really disappointed
Like you said, they aren’t trying to hide it. I’m sure they weren’t sure where performance would end up at launch though. They publicly said they aren’t satisfied with the performance and will be working to improve it though. This isn’t the end of it. It’s disappointing it doesn’t perform as well right now (for us and I’m sure even more for them), but they’ve earned some amount of trust. I’ll give them time to get things where they should be.
From the couple of creators I’ve seen playing it, they were aware of some performance issues for sure. I think they were just unaware of how severe the impact was (since content creators normally have expensive PCs) and how quickly they’d be able to address it.
It never sounded like they were aiming to be super optimised at launch either, but prior to the announcement it did seem like they were confident “most” would be able to play it.
And having watched City Planner Plays’ performance video of it, it sounds like the article didn’t really play around with different settings to see what their impact was. Specifically regarding resolution, it was noted that anything above 1080p seems to perform extremely poorly.
Why there’s no DLSS/FSR at launch is also baffling; it helps GPU-bottlenecked games greatly (even if boosting from a native 30 to 60 is a bit yuck)
I believe I had heard something about them having issues with getting it running, because for some reason they included their own “render scale” option that runs like ass. You can, fortunately enough, very easily add DLSS to most games even if they don’t natively support it though. That’s most likely what I will be doing.
If I had to guess, the reason they waited so long is that they thought they could fix the issues before launch, but stuff probably came up that made them realize it wasn’t going to be ready.
This is such a silly argument. Sure, I can make a game that has a fucking memory leak to “really put your PC to the limit” and render every single tri of every model no matter the distance you are looking from, but that is just a stupid way of “pushing your PC to the limit”. Hell, let’s make a 30-billion-tri model for a generic NPC and populate a scene with many of them; that will surely push your PC to the limits. This is just a poorly-put-together hackjob where they know they can just patch it post launch because fools will buy this shit. The devs are working hard on this game, but optimization shouldn’t just be pushed off to the post-launch era of a goddamn game.
Just like Crysis was a poorly-put-together hackjob too, right? You know, like in that post where you used Crysis as an example of a game with poor performance at launch. And of course CryEngine is known as a buggy, cooked piece of shit, right?
Oh, wait, no, it’s quite literally the opposite. Weird.
And yet everything you said has no relevance.
A poorly optimized game will not push your PC to its limit. Instead it will bottleneck itself on stupid issues and inefficiencies; your PC will actually not be utilised to its full potential. Make no mistake, this game isn’t slow because it’s gorgeous and advanced, it’s slow because graphically it’s poorly made.
Don’t worry. They’ll release optimization as a DLC.
It’s known performance will be poor, but if it were that bad, the ton of YouTubers doing their preview coverage would have been reporting it.
No, there was a performance embargo for reviewers that wasn’t lifted until after the developers had made their statement a few days ago.
It was pretty funny seeing stuttery footage on 60fps YouTube videos without any acknowledgement from the player lol
And they thought that just ignoring such a clear issue was a good approach to take? Wow, that’s fucking scummy on both sides.
There is a reason for them not to report on it. They were still working on the game (and they still are, even now). They don’t know how the performance will end up at release until it’s there. Reporting on it too early just misinforms people.
Well, surely if they’re playing it 2 weeks before it’s due to launch and it runs like garbage, they’d think “hmm, maybe this won’t be ready in time. I should probably tell people about it” rather than just being greedy and sweeping it under the rug. Also, you can be honest about issues you experience with the people watching your content. If it gets better before it’s released, you just make an update video stating you’ve seen an improvement over time. No need to hide it
I’ve watched a good bit of the game so far. I don’t think anyone hasn’t discussed performance. It’s not something being hidden, it just isn’t where it should be or where they want it to be, and they’ve been clear they’re going to continue working on it to get it where it should be. They just can’t hit that target for launch. Delaying it wouldn’t be great either because plenty of people will be able to run it, just not as well as it could be. That’s OK in my opinion.
I think they did well tbh. They of course hoped they would be able to resolve it before release, so it shouldn’t be the focus on reviewers.
At least reviewers made it clear that there still was something to be said about performance, but it would have to wait.
And Colossal Order then made a clear statement on performance themselves ahead of the launch date.
They do. Look at City Planner Plays’ video. According to that, I can hope to get a bit more than 30fps at 1080p with medium details on my 4070 Ti and 7600X. Beyond that, I‘ll get a slideshow. For most PC owners out there, the game will be unplayable in its current state.
Most YouTubers have beefy rigs. Also, the preview build could have some kind of limitation which was never intended for the final release but which improves performance
Probably will trial it and then wait for sale. By the time it goes on sale, it should run better lol
You can pick up Game Pass (which Skylines 2 comes to on the 24th)
It’s only a buck for 14 days (though that might be region specific), so I’m definitely going to pick it up and give it a go to see how bad the performance is
Unfortunately Gamepass doesn’t work on Linux. Well, I’ll just trial it for free then.
So, this is releasing on PlayStation and Xbox as well? How the fuck will they be able to run it -at all-, 480p and 30fps on low?
Maybe that’s the reason the console release was pushed to next year
It will run at a glorious 60fph.
I mean, I doubt that, considering the PS5 and Series X have roughly the performance of an RTX 2070, while the Series S has roughly that of a GTX 1650. If that’s anywhere near the truth, there’s something seriously wrong with the game design. That’s about a 150% difference in performance compared to the hardware in this post.
Edit: yes I know I misread his joke, I addressed it further down.
He said fph not fps
I must admit that I didn’t do any research, my previous comment was meant as a joke.
Dammit, flew right over my head
I wonder what I did wrong on the delivery of the joke.
Linked article is nothing but Unreal Engine fanboy masturbation.
Those people are fucking weird.
Also a bizarre comparison. Cities 2 is a simulation game, and those are very CPU-intensive games. The graphics are nice, but the problem is likely balancing the CPU demand against the graphics, rather than the graphics themselves. Simulation bottlenecks will drop the FPS drastically, regardless of the graphics engine.
From what I’ve seen of the game on Twitch, I think the performance issues aren’t game breaking. It seems the game runs fine if you reduce settings; while it’s far from ideal, it looks playable.
But it will be damaging for the game. Mods won’t launch until after the game does, and they may be delayed further by the time spent fixing the game post launch. For a game that succeeded in very large part due to user content, that may really harm its success.
Linked article is nothing but Unreal Engine fanboy masturbation.
UE having better 3D performance than Unity isn’t really that much of a hot take. Unity got that much traction because of its really favorable licensing terms before the recent change.
Have you seriously not noticed how there’s this weirdo subset that feels the need to throat UE every chance they get?
Why does the author specifically mention UE and not any of the other engines on the market? Why not Source, which is renowned for being one of the most flexible and performant engines out there?
UE fanboyism is real and it’s fucking weird.
Have you seriously not noticed how there’s this weirdo subset that feels the need to throat UE every chance they get?
No, I didn’t but I’m also not diving into every tech subculture.
Why does the author specifically mention UE and not any of the other engines on the market?
My guess is because of its versatility and the rumor from a couple of months ago that Cities 2 was UE-based. As an open source proponent myself, I would like to see the CryEngine-derived Open 3D Engine (O3DE) or Godot gain more traction, but at least the latter is still lacking some features other 3D engines have had for ages; I seem to recall reading a few weeks ago that shader stuttering is still a thing even in the newest Godot release. I don’t think any shipped product is using O3DE, so using that would be a big gamble for a relatively small development studio.
Why not Source, which is renowned for being one of the most flexible and performant engines out there?
Source 1 is pretty outdated, and Source 2 is used by Dota 2 and two FPS games.
The worst thing is when people who have absolutely no clue about game development believe advertisements and hype and get really opinionated about stuff they know nothing about, such as engines, tech or techniques.
Also worth noting that Skylines 2 comes out on Xbox game pass day 1. You can usually pick up a trial for a fortnight, that’s a pretty perfect opportunity to try this on PC (to see how bad it runs for you)
That’s what I’ll be doing, trying it out and most likely skipping it for a few months while they polish it up
Did you just say a “fortnight”?!.. hold my beer…
GPUs have been general-purpose many-core processors for like fifteen years now. Please stop designing simulations that run exclusively on CPU… and especially stop tying your simulation speed to the goddamn renderer.
I’ve written a 60 Hz renderer for a game that’s allowed to chug while you glide smoothly through it, and I’ve written a 60 Hz physics engine for a game that gets fractional frames per second, and I’m just some schmuck. What is your excuse? What kind of NP-hard nightmare did you design for yourselves, instead of identifying bottlenecks and faking the hell out of them? SimCity could run on 8-bit microcomputers. You cannot possibly be struggling to reach an acceptable minimum complexity, using hardware that’s forty years newer and ten thousand times faster.
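For what it’s worth, the decoupling being demanded here is a solved problem: the classic fixed-timestep loop. A minimal sketch of the bookkeeping (hypothetical names, obviously not the game’s actual code):

```python
def advance(accumulator, frame_dt, sim_dt):
    """Fixed-timestep bookkeeping: given the wall-clock time the last
    render frame took, return how many whole simulation steps to run
    and the leftover time to carry into the next frame. Simulation
    speed never depends on render framerate; a slow renderer just
    runs more sim steps per frame."""
    accumulator += frame_dt
    steps = int(accumulator // sim_dt)
    return steps, accumulator - steps * sim_dt

# A render frame took 5 time units against a 2-unit sim step:
# run 2 whole sim steps and carry 1 unit into the next frame.
steps, leftover = advance(0.0, 5.0, 2.0)
```

The renderer can then interpolate between the last two simulation states using `leftover / sim_dt`, which is how a fixed 30 Hz simulation still looks smooth at 60+ fps.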
I’m pretty confident the game isn’t tied to framerate, and also the game is almost always GPU bottlenecked from what I’ve heard. From what I’ve watched of the game, it has a ton of compute shaders and other shader work. In particular, weather is apparently a large cause of framerate issues, particularly temperatures. That’s because (I’m betting) the game is computing temperatures on the GPU and using that to draw snow and other things on the terrain and also structures. I’m pretty sure they know what they’re doing. They just did too much, and now they need to try to optimize it.
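If that bet is right, the temperature pass would be an embarrassingly parallel per-cell neighbour update, which is exactly what compute shaders are good at. A toy CPU version of the idea (pure speculation, nothing from the actual game):

```python
def diffuse(temp, rate=0.25):
    """One diffusion step over a 2D temperature grid: each interior
    cell relaxes toward the average of its four neighbours. On a GPU
    this would be one compute dispatch, one thread per cell."""
    h, w = len(temp), len(temp[0])
    out = [row[:] for row in temp]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            avg = (temp[y - 1][x] + temp[y + 1][x] +
                   temp[y][x - 1] + temp[y][x + 1]) / 4.0
            out[y][x] = temp[y][x] + rate * (avg - temp[y][x])
    return out

def snow_mask(temp, freezing=0.0):
    """Snow gets drawn on terrain/buildings wherever the local
    temperature is at or below freezing."""
    return [[t <= freezing for t in row] for row in temp]
```

Running something like this every frame over a city-sized grid, plus reading the result back for rendering, would explain a GPU bottleneck that has nothing to do with how pretty the graphics are.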
They don’t work on a custom engine, so they don’t have engine programmers, so they don’t know how, or probably can’t (I don’t know how you would do that in Unity). That’s the price you pay for ease of development.
If Unity is so dodgy that a team of professionals can’t figure out how to spread a workload over time, they should have written it off immediately. The nature of their game was not a surprise. They’re not naive in this genre - it’s a direct sequel.
Unity has its issues, especially lately, but the engine is perfectly capable of doing what they want to do. Most of the GPU load is likely just HLSL that’ll run pretty much (if not exactly) identically no matter what engine you’re using. UE would be a horrible choice for the game, especially 5. They could make their own engine, but no way they do that and don’t have to cut at least half of the features of the game.
The issue is just they did too much without enough time to optimize. They said they’re working on it and they aren’t happy with where it is. They’ve earned my trust, so I’ll take their word that it’s being worked on. Don’t just assume they’re telling the truth, but give it time.
They could make their own engine, but no way they do that and don’t have to cut at least half of the features of the game.
As opposed to now, where they also can’t handle the features of the game.
Unity being famously jam-packed with features relevant to a cutting-edge city simulator.
No kidding they’re going to make it better, over time, but there is no way this snuck up on them.
An engine is just a set of tools stuck together. If they have to write their own renderer, editor, interpreter/compiler, and everything else, that’s a ton of investment that you then can’t spend on game features. You don’t do that unless you have a very good reason. You’re also required to maintain it all yourself; you don’t just get upgrades essentially for free as the engine updates.
Unity actually does have a lot of features that are useful for a city builder, like ECS. I don’t know if you’re trying to be sarcastic.
The performance probably didn’t “sneak up on them” but they almost certainly didn’t know how it’d end up. There’s likely still a lot of optimization left in there and a lot of optional things that can be enabled/disabled. There’s no way to know how the end product is going to look until you’re nearing the end and all the pieces come together and time starts running out.
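On the ECS point: the win for a city builder is data layout. Components of one type live in contiguous arrays, so a simulation pass over tens of thousands of citizens is a cache-friendly linear scan. A toy structure-of-arrays illustration (hypothetical names, nothing to do with Unity’s actual DOTS API):

```python
from array import array

class Citizens:
    """Structure-of-arrays storage: each field is its own packed
    float array instead of one heap object per citizen."""
    def __init__(self):
        self.x = array('f')
        self.y = array('f')
        self.happiness = array('f')

    def add(self, x, y, happiness):
        self.x.append(x)
        self.y.append(y)
        self.happiness.append(happiness)

    def decay_happiness(self, amount):
        # A "system": one tight loop over one contiguous array,
        # trivially parallelisable across worker threads.
        for i in range(len(self.happiness)):
            self.happiness[i] = max(0.0, self.happiness[i] - amount)
```

The same layout is why ECS systems chew through huge entity counts; it doesn’t help at all if the frame time is going to shaders instead.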
“This commercial engine cannot handle our game” is a pretty good reason.
However well ECS does its thing - it’s obviously not doing enough, for this game. Even if this somehow came on by degrees, and performance got just the tiniest bit worse day after day, the time to stop tolerating that and pursue performance was a year ago. That doesn’t rule out bugfixes. That doesn’t rule out new systems. That doesn’t rule out major changes.
But they’ve surely been fighting Unity for a long damn while and this is the best they could do.
“This commercial engine cannot handle our game” is a pretty good reason.
The “engine” isn’t at fault. You can keep adding things that consume resources until you eventually use up too many. That’ll happen on any engine.
However well ECS does its thing - it’s obviously not doing enough, for this game.
I believe you’re wrong here, actually. By all accounts I’ve heard, it’s GPU bottlenecked even with the increased entity counts over C:S1. It’s likely just too many shaders doing too much work too frequently. Weather and temperature are both apparently big hogs, which to me looks like the perfect opportunity for shaders to handle a lot of the work, and I’m sure that’s what’s happening.