Can’t your eye only see like 30 frames per second in the center?
Our eyes and brains don’t perceive still images or movement in the same way as a computer. There is no simple analogy between our perception and computer graphics.
I’ve read that some things can be perceived at 1000 fps. IIRC, it was a single white frame shown for 1ms between black frames. Of course most things you won’t be able to perceive at that speed, but it certainly isn’t as simple as 30 fps!
The human brain evolved to recognize threats in the wilderness.
We see movement and patterns very well because early hominid predators were very fast and camouflaged, so seeing the patterns of their fur and being able to react to sudden movements meant those early people didn’t die.
But evolution doesn’t optimize. Things only evolve up to the point where something lives long enough to reproduce. Maybe over extremely long time spans things will improve if they help find mates, but that is all evolution does.
Your brain perceives things fast enough for you not to get eaten by a tiger. How fast is that? Who the fuck knows.
All that being said, I like higher Hz monitors. I feel like I can perceive motion and react to things more quickly when the frame rate is higher. The smoother something looks, the more likely I feel that I can detect something like part of a character model rounding a corner. But no digital computer is ever going to have analog “frame times”, so any refresh rate you think feels comfortable is probably fine.
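If you want to put actual numbers on “smooth”: the time budget per frame is just 1000 ms divided by the refresh rate. A quick back-of-the-envelope sketch in C (the rates in the list are just examples I picked, including the 1 ms / 1000 fps case mentioned above):

```c
#include <stdio.h>

int main(void) {
    /* Frame time is just the reciprocal of the rate: 1000 ms / Hz. */
    const double rates[] = {30.0, 60.0, 120.0, 144.0, 1000.0};
    const int n = sizeof(rates) / sizeof(rates[0]);

    for (int i = 0; i < n; i++) {
        printf("%6.0f Hz -> %6.2f ms per frame\n", rates[i], 1000.0 / rates[i]);
    }
    return 0;
}
```

Going from 30 to 60 Hz cuts the frame time from about 33.3 ms to 16.7 ms, while going from 120 to 144 Hz only shaves off about 1.4 ms, which is roughly where the diminishing-returns argument comes from.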
No. IIRC around 200 Hz is where you get diminishing returns.
No. This is something console fanboys used to spread around when PC gamers showed off their 30+ fps games.
No, your brain just blurs an overwhelming amount of visual information into images as it sees fit. It doesn’t have a framerate limit.
That’s what I’ve heard. But also, the frequency of electricity in the USA is 60 Hz because Tesla found after experimentation that that’s the frequency where you stop noticing a lightbulb flickering. Since a bulb on 60 Hz AC flickers 120 times per second (it brightens on both halves of each cycle), you could assume that a framerate lower than 120 fps is noticeable.
Technically yes, but the more fluid the video is in the first place, the fewer gaps your brain has to fill in. At 30 fps you can see the moving image just fine, but your brain is constantly assembling the pieces and ignoring the gaps. Higher framerates reduce the number of gaps and make a surprising difference in how smooth something looks in motion.
So you just explained why it’s actually a no.
Also most monitors only go up to 60 fps, and even if you have a fancy monitor that goes higher, your OS probably doesn’t bother to go above 60 anyways. Even if the game itself says the fps is higher, it just doesn’t know that your PC/monitor isn’t actually bothering to render all the frames…
my man, just because you’ve never seen the refresh rate option in the monitor settings doesn’t mean it hasn’t been there since basically forever
This is blatantly false.
Windows will do whatever refresh rate the EDID reports the display as being capable of. It won’t necessarily do it by default, but it’s just a simple change in the Settings app.
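If you’re curious what modes Windows actually picked up from the EDID, here’s a rough sketch using the Win32 EnumDisplaySettings call to list everything reported for the primary display (primary display only, minimal error handling):

```c
#include <windows.h>
#include <stdio.h>

/* Enumerate every display mode Windows knows about for the primary
 * display, including the refresh rate it learned from the EDID. */
int main(void) {
    DEVMODE mode = {0};
    mode.dmSize = sizeof(mode);
    mode.dmDriverExtra = 0;

    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &mode); i++) {
        printf("%lux%lu @ %lu Hz\n",
               mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency);
    }
    return 0;
}
```

If your high-refresh mode shows up in that list but the desktop is still stuck at 60 Hz, it really is just the refresh-rate dropdown in the advanced display settings that needs changing.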
Macs support higher than 60 Hz displays these days, with some of the laptops even having a built-in one. They call it by some stupid marketing name, but it’s a 120 Hz display.
Linux requires more tinkering with modelines and is complicated by the fact that you might either be running X or Wayland, but it’s supported as well.
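For the X11 (or XWayland) case you can at least see every mode the driver already knows about without writing a modeline by hand. A rough sketch against libXrandr, assuming the X development headers are installed and compiling with -lX11 -lXrandr:

```c
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

/* Dump every mode the X server (or XWayland) already knows about,
 * with the refresh rate derived from the modeline timings. */
int main(void) {
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "couldn't open an X display\n");
        return 1;
    }

    XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    if (!res) {
        fprintf(stderr, "RandR not available\n");
        XCloseDisplay(dpy);
        return 1;
    }

    for (int i = 0; i < res->nmode; i++) {
        XRRModeInfo *m = &res->modes[i];
        double hz = (m->hTotal && m->vTotal)
            ? (double)m->dotClock / ((double)m->hTotal * m->vTotal)
            : 0.0;
        printf("%-20.*s %ux%u  %.2f Hz\n",
               (int)m->nameLength, m->name, m->width, m->height, hz);
    }

    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}
```

The xrandr command-line tool prints the same information (and can switch modes) if you’d rather not compile anything.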
To add on to this: there are phones coming out now with 90+ Hz screens. They are noticeably smoother than the 60 Hz ones. My current phone does 120 Hz.
Yeah the OS can and will shove out frames as fast as the hardware can support them
Wayland picks up my 155, 144, and 60 Hz monitors and sets them to the correct refresh rate on its own nowadays, so it’s even more painless.
You just won the award for stupidest comment in the whole comment section. That is just completely false and makes no sense in any way. Your computer doesn’t just skip calculations it’s told to do. Where did you even get this idea lmao
What the fuck? Help us understand: which OS are you talking about, and what do you actually see when you check the frame rate with your own eyes?
That was true before high framerate monitors were a thing, which was around 10+ years ago…
no, it wasn’t true back then either; CRTs were doing 100 Hz and more decades ago, and it was very much supported by OSes and games