My background is in telecommunications (the technical side of video production), so I know that 30fps is (or was?) considered the standard for a lot of video. TV and movies don’t seem choppy when I watch them, so why does doubling the frame rate seem to matter so much when it comes to games? Reviewers mention it constantly, and I don’t understand why.

  • jsdz · 1 year ago

    VGA might’ve done that to get better resolution at 60 Hz, but I’m pretty sure earlier systems, including CGA and the Amiga, output 60 fps non-interlaced video at lower resolutions. The Amiga, at least, also had a higher-resolution interlaced mode, but it was mostly used for displaying impressive-looking static images.