Currently trying to refund the new Indiana Jones game because it’s unplayable without raytracing. My card isn’t even old; it’s just that 8GB of VRAM is apparently the absolute minimum, so my mobile 3060 is now useless. I miss when I could play new games in 2014 on my shitty AMD card at 20fps. Yeah, it didn’t look great, but developers still included a very low graphics option for people like me. Now you need to upgrade every 2 years to keep up.
I’ve got a special fucking bone to pick with Cities Skylines 2. I’ve never had a game look like such vaseline-smeared ass while making my computer sound like it’s about to take off. It’s a shame, because it’s definitely come a long way as a game and has some really nice buildings now, but when I play it I start to get nervous after like half an hour and have to let my computer cool down. Fuck that shit.
There are a number of YouTube videos examining how CS2 was designed in such a shockingly bad way that it murders your GPU.
I want my games to be able to be rendered in software, I want them to be able to run on a potato from the early 2000s and late 90s, is this too much for a girl to ask for
Todd Howard made Morrowind run on 64MB of RAM in a cave. With a box of scraps.
The new Indiana Jones is actually pretty decently optimized. I run it at 1080p, all high/ultra settings, on my RTX 3060 12GB with DLAA enabled, at a mostly locked 60fps. It’s leagues better than any UE5 game; it’s just the hard VRAM requirement that sucks.
I feel like a lot of the issues game graphics have nowadays are just that GPU prices have been ridiculously inflated over the last decade because of crypto/AI. It’s not surprising that devs follow the newest trends and technologies when it comes to graphics, but the hardware needs of raytracing, global illumination and the like are just too high for the GPU performance per dollar you can get in 2024. I just recently upgraded from an AMD RX 480 to a used Nvidia RTX 3060 12GB (which seemed to be the best bang for the buck; an RTX 4060 would have been much more expensive for not a lot more performance), and that upgrade gets you maybe double the performance in your games, for a GPU that is almost five years newer (and no VRAM upgrade at all if you get the base model). These cards simply shouldn’t cost as much as they do. If you don’t have unlimited money to spend, you are going to have a much worse experience today than half a decade or a decade ago.
I just want PS2-level graphics with good art direction (and better hair, we can keep the nice hair) and interesting gameplay and stories. Art direction counts for so much more than graphics when it comes to visuals anyway. There are PlayStation 1 games with good art direction that imo are nicer to look at than some “graphically superior” games.
What hair in modern games looks like
when u see a Homo Sapiens for the first time
That one is atrocious, but another thing I find nasty is the amount of disk space new games need. Sure, buying more disk space is way cheaper than getting more graphical power, but downloading 100+ GB for a game I might only play once feels like an incredible waste.
Games should have a lo-fi version with lower-resolution textures and fewer graphical features for the people who cannot actually see the difference in graphics after the PS2 era.
Rainbow 6 Siege had to downgrade map assets because the skins take up too much space lol
Cosmetics are a wholly separate clown show. Dota 2 used to be a few gigabytes. Now, because of all the hats, it’s like 30 GB compressed.
There’s been a couple games I’ve decided to just not buy because the disk space requirement was too high. I don’t think they care much about a single lost sale, unfortunately.
War Thunder (garbage game, don’t play) does this. You can choose to download higher quality textures. I don’t care, I haven’t noticed the difference
The snail yearns for your money
Never. Always f2p
Same, I managed to get to BR 8.7 in the Soviet ground tree as a free-to-play, but I stopped making progress because the higher BR matches just aren’t that fun so I stick around in 3.7-4.0 and gain like no research points, lol
Real. I have my first 8.0 in the Soviet tree, but it’s just a grind no matter what BR I play. 2.7 or so is my go-to.
war thunder bad?
Garbage game. Yet I continue to play
War thunder bad.
damn, I like my little russian planes
There are so many games that don’t have this problem. How about you play those?
I’m finding the latest in visual advancements feels like a downgrade because of image quality. Yeah, all these fancy technologies are being used, but it’s no good when my screen is a mess of blur, TAA, and artifacting from upscaling or framegen. My PC can actually play Cyberpunk with path tracing, but I can’t even begin to appreciate the traced paths WHEN I CAN’T SEE SHIT ANYWAY.
Currently binging Forza Horizon 4, which runs at 60fps on high on my Steam Deck and at 165fps maxed out with 8x MSAA on my PC, and it looks beautiful. And why is it beautiful? Because the image is sharp, so I can actually see the details the devs put into the game. Half-Life: Alyx is another game on a whole other level, with crisp and clear visuals, and it ran on a 1070 Ti with no issues. Today’s UE5 screen vomit can’t even compare.
All games these days know is stutter, smeary image, dx12 problems and stutter
TAA, DOF, chromatic aberration, motion blur, vignetting, film grain, and lens flare. Every modern dev just dumps that shit on your screen and calls it cinematic. It’s awful and everything is blurry. And sometimes you have to go into an .ini file because it’s not in the settings.
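For Unreal-based games at least, you can usually kill most of that from Engine.ini under [SystemSettings]. These are real UE console variables, but the exact names and whether a given game honours them vary by engine version and title, so treat this as a rough sketch rather than a guaranteed fix:

```ini
[SystemSettings]
; motion blur off
r.MotionBlurQuality=0
; depth of field off
r.DepthOfFieldQuality=0
; chromatic aberration off
r.SceneColorFringeQuality=0
; lens flares off
r.LensFlareQuality=0
; basic tonemapper, which drops vignette and film grain on most UE4-era titles
r.Tonemapper.Quality=0
; disables TAA/FXAA entirely (expect jaggies, or lean on DLSS/FSR instead)
r.PostProcessAAQuality=0
```

The file usually lives under the game’s Saved\Config folder in AppData; some games ignore or overwrite these overrides, which is exactly the “not in the settings” problem.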
The gamers yearn for Forward+ rendering…
Yeah, I think gaming as an industry is becoming ‘more specialized’, which is not necessarily good. All the engine developers are working on very generic graphics stuff for Unreal and Unity, rather than engine dev being a position at a company that makes games itself and can greatly optimize the engine for its specific games.
I am lucky enough that I’m not that interested in high-spec AAA titles to begin with: of the 100+ games I’ve put on a DIY wishlist, I’d say fewer than 10 fall in this category. It’s mostly indie/retro titles, older titles, or mid-budget stuff.
I feel like that trend is actually past us. Maybe I haven’t followed gaming too closely but there doesn’t seem to be a benchmark game that is as overwhelmingly demanding, considering the landscape of tech during its time, as something like Crysis.
The most popular games nowadays don’t seem to be prohibitively demanding for commonly bought PCs. Maybe I’m wrong.
Nah, the problem is that they are demanding but don’t push boundaries graphically at all, because there’s absolutely no optimisation.
I know I really shouldn’t play this slop, but Black Ops 6 legitimately looks worse than new-MW3 yet requires me to massively turn settings down to achieve similar framerates and avoid hitching compared to the game it literally replaced. I may as well be playing a game from 10 years ago, and I have a 3070 and a Ryzen 5000-series CPU. Barely anything I’ve played in the last 5 years looks “boundary-pushing amazing”, save for maybe Control, and that was more down to how it used certain effects, I guess.
I know I’m talking about Activision, but it’s not unique to them and their shitty engine. Halo Infinite looked like ass and ran worse. I didn’t even play Starfield because lmao, Bethesda engine. Shit like Dead by Daylight even takes up 50 GB. And I know they blow every single game launch, but given that Frostbite can look very good in some settings, BF2042 was an ugly, empty mess, to the point that it killed the game harder than BFV did. Basically all AAA devs have regressed to pouring the slop into engines cobbled together 15 years ago and releasing games that would immediately crash if anything was compressed, because treatpigs (like me) just accept 100 GB installs and having to juggle which game we can play each week.
There is some boundary pushing, but I feel like the period between 2005 and 2015 was one where, if you had a two-year-old graphics card, you’d struggle with the latest games. Or something. Certainly, a five-year-old graphics card in 2005 would have been rough (as some people mention in this thread).
I think graphics cards have gotten comparatively more expensive though, relative to the rest of the computer.
There are some good videos out there that also explain how UE5 is an unoptimised mess. Not every game runs on UE5, but it’s become the de facto standard for game engines these days.
It’s optimized around dev costs and not performance, sadly.
Can you link some? That sounds very interesting.
This is the main one I saw. It’s kind of an ad for this guy’s game company, but clouds in a skybox shouldn’t cause performance issues: https://youtu.be/6Ov9GhEV3eE
I also would like to see them
My card isn’t even old
It’s about 4 years old, which is pretty old for an entry-tier card to be running the latest AAA titles. My Radeon HD 7770 was only three years old when The Witcher 3 came out and it couldn’t hit minimum spec. Only two years for AC: Unity, but that one was especially demanding.
I do think now is an awkward time where we’re shifting to new tech that isn’t quite ready for prime time, but it’s never going to be until we shift to it
Yea that’s fair, it’s just hard to think of it being outdated when I paid so much for it. Also it’s the first time I’ve experienced VRAM size being the chokepoint of what I can run, but maybe that’s just the new normal.
Yeah it feels arbitrary, especially given how cheap VRAM is. Common Nvidia L (not that they care given the stacks they’re making with data centers)
All of the boomer game devs who had to code games for like a 486 have now retired, replaced with people Nvidia or AMD can jangle shiny keys in front of to get them to build their whole games around graphics tech like cloth physics and now ray tracing.
This is why solo or small team indie devs are the only devs I give a shit about. Good games that run well, are generally cheap, and aren’t bloated messes.
I just want to punch Nazis. Why does it matter if the reflection of an off-screen pigeon appears in Indiana Jones’ eyes??
Because it sells and they like monies
mobile 3060
If you’re using a laptop, you need to do maintenance on it. If you don’t reapply thermal paste/pads and clean your fans, your GPU and CPU will throttle.
2nd, addressing the thread more generally: if you have a 1440p or 4K monitor, you will struggle. Lowering the resolution is a very quick way to claw back performance; this is why the Steam Deck and Switch use 800p/720p screens. My laptop has a 3K display and I have to drop the render resolution for any game I play on it.
Also, “can it run Crysis?” was a thing almost 20 years ago now.
What resolution is 3k?
“3K” doesn’t translate to any specific resolution, and it’s exclusively a thing in higher-end laptops afaik. Anything from 2560x1600 to 2736x1824 to 2880x1620 to 3260x1834, or other totally random display resolutions, gets marketed under this label.
Totally unusable terminology.
People were saying this about Morrowind
They were kind of correct back then too, with the amount of upgrading the industry would expect you to do. That just petered off for a while, luckily. Seems to be back in full force now, though.
That said, at least back then all that shit gave you actual graphical improvements you could see, instead of like raytracing on retinas or some bullshit you’d never notice.
I think that has to do with consoles: when a console generation is outdated mid- or low-range hardware, that forces more general optimization and less added bullshit, especially when the generation drags on way too long and devs end up targeting what is basically a decade-old gaming computer toward the end. When consoles are loss leaders and there’s a shorter window between generations (or upgraded same-generation versions), devs only optimize enough to run on a modern mid-range gaming rig, and specifically on the console configuration of that.
Although there’s some extra stuff to it too, like the Nvidia 10 series being an amazing generation of GPUs that stayed relevant for the better part of a decade, with the upper end of it still sort of relevant now. Nvidia rested on their laurels after that and has been extremely stingy with VRAM, because their cash cow is now high-end server cards for AI bullshit and they want businesses to buy $5000+ cards instead of <$1000 ones that would work well enough if they just had a bit more VRAM. GPUs have also gotten more and more expensive because crypto and AI grifters let Nvidia know they can keep raising prices and delivering less and people will still buy their shit, with AMD just grinning and following after them, delivering better cards at lower prices, but not that much lower, since they can get away with it too.
Yeah but they were right, Morrowind looks too good, every game should look like Cruelty Squad
Can confirm