All you FPS kids are just doing the new version of “eww that game has 2d graphics; polygons or bust!!” from the PlayStation era.
Yes, progress is cool and good, but no it’s not the end-all be-all and no not every game has to have bleeding edge FPS to be good.
Like, we’ve literally already done this shit, guys; can’t we just learn from the past?
My brother or sister in pixels, this is not the same. I’m not a graphics snob. I still play pixelated, barely discernible nonsense games. When I updated from 30 to 144, it was a whole new world. Now even 60 can feel sluggish. This is not a graphical fidelity argument. It’s input and response time and motion perception. Open your mind, man. Accept the frames.
And that matters for certain games, a lot. But it doesn’t functionally matter at all for others. Same as the transition to polygons. My point, which I thought I stated clearly, was not “FPS BAD!!”, it was “FPS generally good, but stop acting like it’s the single most important factor in modern gaming.”
Simply put, if everything was 144fps then it would be easier on the eyes and motions would feel more natural. Even if it’s just navigating menus in a pixel style game.
Real life has infinite frames per second. In a world where high fps gaming becomes the norm, a low 24 fps game could be a great art style and win awards for its ‘bold art direction’.
Not really. Real life is as many FPS as your eyes can perceive, which is about 60 (though it can vary somewhat between people). See: https://www.healthline.com/health/human-eye-fps#how-many-fps-do-people-see
That article states people can perceive images as rapidly as once every 13 milliseconds, which they math out to 75 fps, 25% higher than 60.
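For anyone checking that arithmetic, the conversion is just one second divided by the per-image display time. A quick sketch in Python (the 13 ms figure is the article’s; the exact result is ~77, which the article rounds to 75):

```python
# Convert a per-image display time in milliseconds to an equivalent
# frames-per-second figure, as the article does with its 13 ms number.
def ms_to_fps(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

print(ms_to_fps(13))  # ≈ 76.9 fps, which the article rounds to 75
```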
Looking at the study itself, they were testing whether participants could pick out a picture that displayed for 13-80 ms when “masked” by other brief pictures, with a focus on whether it made a difference if the participant was told what image they were looking for before or after seeing the images. What they found was that participants could pick out the image as low as the 13 ms mark (albeit with less accuracy) and could generally do so better if told what to look for beforehand.
What this tells me is that your source has nothing to say about anything over 75 fps. It was also testing a fundamentally different environment than a video game, where your brain constantly expects each image to be similar to and stem from the one before it rather than being a completely different picture. And if you did draw conclusions from the study despite those differences, it would suggest that knowing what to look for, as your brain does while gaming, makes you better at picking out individual frames. So if anything, your source does not support your assertion, and in a game you could perceive frame rates higher than 75 fps at a minimum.
From my own knowledge, there’s also a fundamental difference between perceiving reality and computer screens: motion blur. Objects moving in real life leave a faint blur behind them that your brain uses to fill in anything it missed, making reality appear smoother than it is. For an example of this, wobble a pencil back and forth to make it “bend.” Movies filmed at 24 fps capture this minute motion blur as they film, which makes it easier for our brains to watch them despite the lower frame rate. Real-time rendered video games don’t have this effect, as there are no after-images to fill in the blanks (unless you turn on motion blur, which doesn’t do a good job emulating this).
This means video games need to compensate, and the best way to do that is more frames per second so your brain doesn’t need to fill in the blanks with the motion blur it’s used to seeing in the real world. You’ll obviously get diminishing returns from the same increase, but there will still be returns.
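The diminishing returns are easiest to see in frame times rather than frame rates: each jump up in FPS shaves fewer absolute milliseconds off the frame interval. A rough illustration (the specific rates here are just common examples, not anything from the thread):

```python
# Frame interval in ms for some common refresh rates. Note how the
# milliseconds saved per step shrink even as the FPS jump grows.
rates = [30, 60, 144, 240]
frame_time = {fps: 1000 / fps for fps in rates}

for prev, cur in zip(rates, rates[1:]):
    saved = frame_time[prev] - frame_time[cur]
    print(f"{prev} -> {cur} fps saves {saved:.1f} ms per frame")
```

So 30→60 saves about 16.7 ms per frame, while 144→240 saves under 3 ms: still a return, just a much smaller one.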
Yeah, as much as I can give a shit about ray tracing or better shadows or whatever, as a budget gamer, frame rate is really fucking me up. I have a very low-end PC, so 60 is basically max. Moving back to 30 on the PS4 honestly feels like I’m playing PS2. I had the [mis]fortune of hanging out at a friend’s house and playing his PC rig with a 40 series card, 240Hz monitor, etc., and suffice it to say it took a few days before I could get back to playing on my shit without everything feeling broken.
That’s more or less the placebo effect at work, though. Most people cannot see “faster” than 60 FPS; the only actual upside of running at a higher frame rate is that you don’t drop below 60 if the game starts to lag for whatever reason. Now, you may be one of the few who actually perceive changes better than normal, but for the vast majority, it’s more or less just placebo.
That’s just wrong. I couldn’t go back to my 60Hz phone after getting a 120Hz new one. It’s far from placebo, and saying otherwise is demonstrably false.
Do you think 60 FPS just got released or something
The discussion is about 144
144 FPS isn’t even bleeding edge, there are monitors with refresh rates higher than that.
One of the most insufferable aspects of video game culture (PC gaming in particular), other than the relentless toxic masculinity from insecure nerds, is an obsessive focus on having powerful hardware and shitting on people who think they are getting a good experience when they don’t have good hardware.
The point is to own a computer that other people don’t have so you can play a game and get an experience other people don’t get; the point isn’t to celebrate a diversity of gaming experiences and value accessibility for those without the money for a nice computer. It really doesn’t matter whether these people are doing this consciously or not; this is a story as old as time. It is the same exact bullshit as guitar people who only think special exotic or vintage guitars are beautiful and claim to absolutely love guitar, but have never once in their life stopped to think about how much more beautiful it is that any random chump can get an objectively wonderful-sounding guitar for a couple of hundred dollars than that they own some stupid special edition guitar with a magic paint job that cost as much as my shitty car.
Good thing these people don’t fully dictate the flow of all of video game development, but they will never ever learn because this is the kind of pattern that arises not from conscious intention but rather from people uninterested in critically examining their own motivations.
It is the same damn nauseating thing with photography too….
I’d rather not get motion sickness thanks.
it depends on whether it’s a good 30 or not. if inputs are quick and responsive and the framerate stays at 30, then it’s fine. but if my device is struggling to run the game and it’s stuttering and unresponsive, then it’s awful
sm64 comes to mind as the best 30fps experience i’ve had, and i am spoiled rotten on high refresh rate games