My biggest gripe with cooking instructions is the non-specificity. “Stir pasta frequently”? How frequently? How continuously? Tell me in Hertz.
I won’t accept my pasta at anything lower than 120Hz.
The human eye cannot see more than 24Hz, so why bother
Sooo…just curious how you explain this?
Just another of those internet image optical illusions. You won’t be fooling anyone on here 🧐
I don’t understand the basis of the 24Hz limit rumor. My monitors are 144Hz, and if I limit them to 60Hz and move my mouse around I see fewer residual mouse-cursor “after-images” than I do at 144Hz. That’s a simplified test showing that the eye can perceive motion artifacts beyond 60Hz.
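A back-of-the-envelope model of that test: within the eye’s persistence window, the number of cursor “ghosts” grows with refresh rate while the gap between them shrinks. The ~100ms window and 2000px/s mouse speed below are assumed ballpark figures, not measurements:

```python
# Rough model of the cursor "after-image" test: how many discrete cursor
# images land inside the eye's persistence window, and how far apart they are.
def cursor_ghosts(refresh_hz, speed_px_s, persistence_s=0.1):
    images = int(refresh_hz * persistence_s)  # frames drawn in the window
    gap_px = speed_px_s / refresh_hz          # cursor travel between frames
    return images, gap_px

for hz in (60, 144):
    n, gap = cursor_ghosts(hz, speed_px_s=2000)
    print(f"{hz:>3}Hz: ~{n} ghosts, ~{gap:.0f}px apart")
# 60Hz: ~6 ghosts ~33px apart; 144Hz: ~14 ghosts ~14px apart
```

More, closer-spaced ghosts at 144Hz matches the observation above of seeing fewer of them at 60Hz.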
The eye can perceive the flicker of LEDs running on rectified 60Hz AC; it’s very annoying.
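For reference, assuming a bare LED with no smoothing capacitor: half-wave rectification lights the LED on only one half-cycle, so it flickers at the mains frequency, while full-wave rectification doubles that:

```python
MAINS_HZ = 60
print("half-wave:", MAINS_HZ, "Hz flicker")      # lit on one half-cycle only
print("full-wave:", 2 * MAINS_HZ, "Hz flicker")  # both half-cycles light it
```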
I think it’s the threshold below which most people see motion as jittery. You may be able to differentiate between higher FPS settings, but above 24Hz most people shouldn’t be able to see discrete steps.
That’s at least how I’ve come to understand it
Your eyes are not digital. Nothing physical really is. Think about a camera flash: flashes can last well under 3.33ms, i.e. less than one frame at 300fps (quick arithmetic after this comment), and you can still see them clearly (and painfully). Same for a monitor; it also has a “response time”, which is how long a pixel takes to transition color (usually measured “gray to gray”, as in one shade of gray to another; black to white takes longer, as it does for eyes).
So ofc you would see all the mice.
It’s also why motion blur is a thing, even though it’s often implemented badly. Seeing every motion on a TV or monitor in perfect sharpness feels weird, because they are pictures, not actual movement.
Your brain makes movement out of it all.
Anyway: 16fps is the minimum, 24 is good for most movies, 30 for slower games, 60 minimum for first-person shooters (75 and above for faster ones, even though I played Xonotic at 45), and 120 for VR.
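The frame times behind those numbers are just reciprocals of the rates; that’s also where the 3.33ms figure above comes from:

```python
# Frame time is the reciprocal of the frame rate: 1/300 s ≈ 3.33 ms.
for fps in (16, 24, 30, 45, 60, 144, 300):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")
```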
24Hz is the lower limit. People will perceive 24Hz as a smooth sequence, especially with motion blur, while anything below it starts to look choppy. Of course humans can perceive higher frequencies. But 24Hz became the standard because celluloid film was expensive, especially in the early days of cinema: the fewer frames you need to shoot, the less film you need to buy and develop. And film stock back then was probably not sensitive enough for the shorter exposure times that come with higher frame rates.
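To put rough numbers on the cost argument: standard 4-perf 35mm stock runs 16 frames per foot, so footage (and cost) scales linearly with frame rate. A sketch, assuming a 90-minute feature for illustration:

```python
FRAMES_PER_FOOT = 16  # standard 4-perf 35mm film
RUNTIME_S = 90 * 60   # a 90-minute feature (assumed for illustration)
for fps in (16, 24, 48):
    feet = fps * RUNTIME_S / FRAMES_PER_FOOT
    print(f"{fps} fps: {feet:,.0f} feet of film")
# doubling the frame rate doubles the stock you buy and develop
```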
No, it can see much more. Bonus: your brain can ‘see’ more than 100Hz too. Google Bundesen’s TVA (Theory of Visual Attention). Source: I worked on programs to measure it for my girlfriend’s PhD. Also, I play FPS games :D
Only 120Hz?! I refuse to eat any pasta below 2.4GHz.
Those are some loose standards… I only accept pasta at 1.21 Jiggawatts and 88mph.
Just imagine the chaos when you run the microwave at the same time!
What kind of dumb instructions are those?
Stirring exactly once is enough in most cases.
Maybe a graph of how strong the bond gets over time for 2 elements?