My background is in telecommunications (the technical side of video production), so I know that 30fps is (or was?) considered the standard for a lot of video. TV and movies don’t seem choppy when I watch them, so why does doubling the frame rate seem to matter so much when it comes to games? Reviewers mention it constantly, and I don’t understand why.

  • MrFunnyMoustache

    Since you are familiar with video production, I don’t need to explain the basics of how cameras work, so I’ll jump right into the relevant part: shutter speed. Shutter speed produces motion blur, which is what makes video appear smooth. A video game frame is an instantaneous snapshot, the equivalent of a shutter speed of zero, so there is no motion blur at all. You can replicate the effect by shooting 30 FPS video with a much faster shutter speed: any movement will look choppy, especially fast-moving objects. A rough worked example is sketched below.
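
    Here is a minimal back-of-the-envelope sketch of that effect. The numbers are illustrative assumptions: an object crossing a 1080p screen in one second, and a film-style 180-degree shutter (1/60 s at 30 FPS) versus a fast 1/1000 s shutter versus a game's instantaneous frame.

    ```python
    # Illustrative sketch: how far an object smears across the frame while
    # the shutter is open. All numbers here are assumptions for the example.

    def blur_length_px(speed_px_per_s: float, shutter_s: float) -> float:
        """Distance the object travels during the shutter-open interval."""
        return speed_px_per_s * shutter_s

    speed = 1920.0  # object crossing a 1080p screen in one second (assumption)

    # 30 fps with a film-style 180-degree shutter (1/60 s): visible blur
    print(blur_length_px(speed, 1 / 60))    # 32.0 px of blur -> smooth motion
    # 30 fps with a very fast 1/1000 s shutter: almost no blur -> choppy look
    print(blur_length_px(speed, 1 / 1000))  # ~1.9 px
    # a game frame is an instantaneous sample, shutter ~ 0 -> zero blur
    print(blur_length_px(speed, 0.0))       # 0.0 px
    ```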

    Games have the option to add artificial motion blur, but it increases delay, making the game feel even more sluggish.
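
    To see why blur can make a game feel less responsive, here is a minimal sketch of one naive approach, frame accumulation, where each displayed frame blends in the previous output. This is just one illustrative technique (real engines typically do something more sophisticated), and the decay factor is an arbitrary choice:

    ```python
    import numpy as np

    def blur_step(current: np.ndarray, accumulated: np.ndarray,
                  decay: float = 0.5) -> np.ndarray:
        """Blend the new frame with the running accumulation, so each
        displayed frame carries a trail of earlier frames."""
        return decay * accumulated + (1.0 - decay) * current

    # The blend is cheap, but the displayed image now mixes in older frames,
    # so what you see lags behind the newest game state -- the extra
    # sluggishness described above.
    frame_a = np.zeros((1080, 1920, 3), dtype=np.float32)  # old frame
    frame_b = np.ones((1080, 1920, 3), dtype=np.float32)   # newest frame
    out = blur_step(frame_b, frame_a)  # halfway between old and new frame
    ```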

    Beyond how it looks, gaming is interactive: you need to react quickly to what happens on screen. A higher frame rate shortens the delay between your input and the frame that shows its result, letting you actually play properly. This matters most in competitive games, or any game that requires fast reactions or precise timing. Personally, 60 FPS is barely enough; for a good experience I want at least 100 FPS with variable refresh rate technology, or 120 FPS without. The arithmetic below shows roughly how much waiting each frame rate adds.
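
    A quick sketch of the frame-time arithmetic, ignoring engine and display pipeline overhead (which add more on top): at 30 FPS a new frame only arrives every ~33 ms, so an input waits on average half a frame before it can even be shown.

    ```python
    # Back-of-the-envelope frame times: how long an input can sit before
    # the next frame is able to reflect it (pipeline overhead ignored).

    for fps in (30, 60, 100, 120):
        frame_ms = 1000.0 / fps
        # worst case: the input lands just after a frame was drawn, so it
        # shows up one full frame later; the average wait is half a frame
        print(f"{fps:>3} fps: {frame_ms:5.1f} ms/frame, "
              f"avg added wait {frame_ms / 2:4.1f} ms")
    ```

    Going from 30 to 120 FPS cuts the average added wait from ~16.7 ms to ~4.2 ms, which is why reviewers keep bringing it up.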