Currently trying to refund the new Indiana Jones game because it’s unplayable without raytracing. My card isn’t even old; it’s just that 8GB of VRAM is apparently the absolute minimum, so my mobile 3060 is now useless. I miss when I could play new games back in 2014 on my shitty AMD card at 20fps. Yeah, it didn’t look great, but developers still included a very low graphics option for people like me. Now you need to be upgrading every 2 years to keep up.
that one is atrocious, but another thing i also find nasty is the amount of disk space new games need. sure, buying more disk space is way cheaper than buying more graphical power, but downloading 100+ GB for a game I might only play once feels like an incredible waste
games should have a lo-fi version with lower-resolution textures and fewer graphical features for the people who cannot actually see the difference in graphics after the ps2 era
Rainbow 6 Siege had to downgrade map assets because the skins take up too much space lol
Cosmetics are a wholly separate clown show. Dota 2 used to be a few gigabytes. Now, because of all the hats, it’s like 30 GB compressed.
War Thunder (garbage game, don’t play) does this. You can choose to download higher quality textures. I don’t care, I haven’t noticed the difference
The snail yearns for your money
Never. Always f2p
Same, I managed to get to BR 8.7 in the Soviet ground tree as a free-to-play, but I stopped making progress because the higher BR matches just aren’t that fun so I stick around in 3.7-4.0 and gain like no research points, lol
Real. I have my first 8.0 in the Soviet tree, but it’s just a grind no matter what BR I play. 2.7 or so is my go-to
war thunder bad?
Garbage game. Yet I continue to play
War thunder bad.
damn, I like my little russian planes
There have been a couple of games I decided to just not buy because the disk space requirement was too high. I don’t think they care much about a single lost sale, unfortunately.
it’s not just texture quality; game devs also ship TONS of unused assets that just sit on the disk and do nothing. i think about half of GTA V’s install is just unused assets.
There are some good videos out there that also explain how UE5 is an unoptimised mess. Not every game runs on UE5, but it’s become the de facto standard game engine these days
Can you link some, that sounds very interesting.
I also would like to see them
This is the main one I saw. It’s kind of an ad for this guy’s game company, but clouds in a skybox shouldn’t cause performance issues: https://youtu.be/6Ov9GhEV3eE
It’s optimized around dev costs and not performance, sadly.
I want my games to be able to be rendered in software, I want them to be able to run on a potato from the late 90s or early 2000s. Is this too much for a girl to ask for?
Todd Howard made Morrowind run on 64MB of RAM in a cave. With a box of scraps.
I’m finding that the latest visual advancements feel like a downgrade because of image quality. Yeah, all these fancy technologies are being used, but it’s no good when my screen is a mess of blur, TAA, and artifacting from upscaling or framegen. My PC can actually play Cyberpunk with path tracing, but I can’t even begin to appreciate the traced paths WHEN I CAN’T SEE SHIT ANYWAY.
Currently binging Forza Horizon 4, which runs at 60fps on high on my Steam Deck and at 165fps maxed out with 8x MSAA on my PC, and it looks beautiful. And why is it beautiful? Because the image is sharp, so I can actually see the details the devs put into the game. Half-Life: Alyx is another game on a whole other level, with crisp, clear visuals, and it ran on a 1070 Ti with no issues. Today’s UE5 screen vomit can’t even compare
All games these days know is stutter, smeary image, dx12 problems and stutter
TAA, DoF, chromatic aberration, motion blur, vignetting, film grain, and lens flare. Every modern dev just dumps that shit on your screen and calls it cinematic. It’s awful and everything is blurry. And sometimes you have to go into an ini file to turn it off because it’s not in the settings.
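For Unreal Engine games at least, most of that can usually be forced off through Engine.ini. Rough sketch below, assuming the title still respects the stock UE console variables; the config path and exact cvar names vary per game and engine version, so treat it as a starting point rather than a guaranteed fix:

```ini
; Append under [SystemSettings] in the game's Engine.ini
; (on Windows, typically somewhere under %LOCALAPPDATA%\<GameName>\Saved\Config\ - varies per game)
[SystemSettings]
r.MotionBlurQuality=0            ; motion blur off
r.DepthOfFieldQuality=0          ; depth of field off
r.SceneColorFringeQuality=0      ; chromatic aberration off
r.LensFlareQuality=0             ; lens flare off
r.Tonemapper.GrainQuantization=0 ; film grain off (some titles use r.FilmGrain=0 instead)
r.PostProcessAAQuality=0         ; TAA off entirely - expect shimmer/aliasing in exchange for sharpness
; vignetting is usually baked into the tonemapper; lowering r.Tonemapper.Quality can remove it,
; but it changes the overall look, so test per game
```

Some games rewrite their configs on launch, so you may need to set the file read-only afterwards.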
All of the boomer game devs who had to code games for like a 486 have now retired, replaced with people nVidia or AMD can jangle shiny keys in front of so they build their whole games around graphics tech like cloth physics and now ray tracing.
I just want to punch nazis why does it have to matter if the reflection of a pigeon off screen appears in Indiana Jones’ eyes??
Because it sells and they like monies
This is why solo or small team indie devs are the only devs I give a shit about. Good games that run well, are generally cheap, and aren’t bloated messes.
I just want ps2-level graphics with good art direction (and better hair, we can keep the nice hair) and interesting gameplay and stories. Art direction counts for so much more than graphics when it comes to visuals anyway. There are Playstation 1 games with good art direction that imo are nicer to look at than some “graphically superior” games.
What hair in modern games looks like
when u see a Homo Sapiens for the first time
Yeah i would much rather a hairstyle be a single solid texture than whatever the fuck this “HAIRFX individual hair rendering 9000” bullshit is, which always ends up looking like trash
I got a special fucking bone to pick with Cities Skylines 2. I’ve never had a game look like such vaseline-smeared ass while making my computer sound like it’s about to take off. It’s a shame because it’s definitely come a long way as a game and has some really nice buildings now, but to play it I start to get nervous after like half an hour and have to let my computer cool down, fuck that shit.
There are a number of YouTube videos examining how CS2 was designed in such a shockingly bad way that it murders your GPU
This is part of why I’ve pretty much stopped following mainstream releases. Had to return Space Marine 2 because it would not stop crashing and the low settings looked like absolute dogshit
if a game can’t run on everything people have run Doom on, i don’t want to play it
yes this includes the digital pregnancy test and the parking ticket validator
duke nukem: that’s a lot of ~~words~~ VRAM. too bad i’m not buyin it.
seriously, i hate this shit. i have a 1080 ti that i got used several years ago when the market had hit a bit of a lull, and it’s got some firmware bug that stops it from running most modern games, even ones it has enough vram for. this is why indie games, and old games that people are still making mods or private servers for like cod4, are so great.
My 1080 kept having performance issues with each update. I had to revert to an older driver in order to get anything to work. Nvidia and whatever game devs blamed each other for the issue, so I haven’t figured out why it was happening.
My CPU is 12 years old and my GPU 7. So yeah… I’m gonna stick with indie and older games.
The gamers yearn for Forward+ rendering…
Yeah i think gaming as an industry is becoming “more specialized”, which is not necessarily a good thing. All the engine developers are working on very generic graphics stuff for Unreal or Unity, rather than engine dev being a position at a company that makes its own games, where they could heavily optimize the engine for a specific title.
People were saying this about Morrowind
Yeah but they were right, Morrowind looks too good, every game should look like Cruelty Squad
They were kind of correct back then too, with the amount of upgrading the industry expected you to do. That just petered off there for a while, luckily. Seems to be back in full force now though
That said, at least back then all that stuff gave you graphical features you could actually see, instead of like raytracing on retinas or some bullshit you’d never notice
I think that has to do with consoles: when a console generation is outdated mid- or low-range hardware, it forces more general optimization and less added bullshit, especially when that generation drags on way too long and devs end up targeting what is basically a decade-old gaming computer by the end. When consoles are loss leaders and there’s a shorter window between generations or upgraded same-generation versions, devs only optimize enough to run on a modern mid-range gaming rig, and specifically on the console configuration of that.
Although there’s some extra stuff to it too, like the NVidia 10 series being an amazing generation of GPUs that stayed relevant for the better part of a decade; the upper end of it is still sort of relevant now. NVidia rested on their laurels after that and has been extremely stingy with VRAM, because their cash cow is now high-end server cards for AI bullshit and they want businesses to buy $5000+ cards instead of <$1000 ones that would work well enough if they just had a bit more VRAM. GPUs have also gotten more and more expensive because crypto and AI grifters let NVidia know they can keep raising prices while delivering less and people will still buy their shit, with AMD just grinning and following after them, delivering better cards at lower prices, but not that much lower, since they can get away with it too.
Can confirm
The new Indiana Jones is actually pretty decently optimized; I run it at 1080p with all high/ultra settings on my RTX 3060 12GB, with DLAA enabled, at a mostly locked 60fps. It’s leagues better than any UE5 game; it’s just the hard VRAM requirements that suck.
I feel like a lot of the issues game graphics have nowadays come down to GPU prices being ridiculously inflated over the last decade because of crypto/AI. It’s not surprising that devs follow the newest trends and technologies when it comes to graphics, but the hardware demands of raytracing, global illumination and the like are just too high for the GPU performance per dollar you can get in 2024. I recently upgraded from an AMD RX 480 to a used Nvidia RTX 3060 12GB (which seemed to be the best bang for the buck; an RTX 4060 would have been much more expensive for not a lot more performance), and that upgrade gets you maybe double the performance in your games, for a GPU that is roughly five years newer (and no VRAM upgrade at all if you get the base model). These cards simply shouldn’t cost as much as they do. If you don’t have unlimited money to spend, you’re going to have a much worse experience today than half a decade or a decade ago.