How Many Frames Per Second Do You Need?
Technically, both video games and movies are just a series of still frames (individual pictures) shown in quick succession. It’s only due to our human perception and brain that we perceive an illusion of continuous movement - if that playback of individual images is fast enough.
But how fast is fast enough?
Movies And Biology
The early movie industry found out that 24 images per second is just barely enough to trick the human brain into perceiving continuous motion. At lower framerates we can still make out the motion, but it feels choppy and unnatural - and that can be distracting.
So, 24 fps covers persistence of vision - but if each image is shown as a flash of bright light with darkness in between, that framerate still feels incredibly flickery. Pre-digital movie projectors solved this by using double-bladed shutters, which made each still image flash up brightly on the screen twice, but those 48 flashes per second still felt slightly flickery. Only when they went for triple-bladed shutters and showed each still frame three times in a row (effectively 72 flashes per second) did the flicker become entirely unnoticeable. In fact, 60 flashes per second is already enough for the vast majority of people to overcome the flicker fusion threshold.
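The shutter arithmetic is simple enough to spell out in a few lines of Python - a toy sketch, nothing more:

```python
# Shutter math for a film projector: each still frame is flashed once
# per shutter blade, so the perceived flash rate is fps * blade count.
FILM_FPS = 24

for blades in (1, 2, 3):
    print(f"{blades}-bladed shutter: {FILM_FPS * blades} flashes per second")
```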
24 frames per second is good enough for movies - as that framerate will be perceived as continuous motion, as long as it’s filmed correctly. That includes avoiding overly fast camera pans, not allowing fast motion too close to the camera, managing shutter speed to get just the right amount of motion blur, and so on, and so forth. Necessity became virtue - and today many of these tricks required to make 24fps work are considered essential elements of the expressive language of cinema, and central to creating a cinematic look. So much so that cinema purists will outright reject faster framerates, because the movie then just looks and feels “wrong”.
Video Games And Cargo Cults
30 fps is not good enough for most video games though. Even the original NES already ran games at 50 or 60 fps (depending on whether it was a PAL or NTSC unit). There wasn’t any motion blur in video games back then, and without that, any faster motion looks quite choppy at 30fps - even at the low resolutions (and small screens) typical of the time. Some slower games could get away with only updating the game logic every other frame, thus running internally at 30Hz while still outputting 60fps (by showing each frame twice). Most platformers made use of 60fps though, because it was simply needed for a smooth feel and responsive controls.
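In code, that half-rate trick looks roughly like this - a minimal sketch with hypothetical function names, not any real console’s API:

```python
# Sketch: the display refreshes at 60Hz, but game logic only ticks at
# 30Hz, so each rendered image is presented on two consecutive refreshes.
DISPLAY_HZ = 60
LOGIC_HZ = 30
REFRESHES_PER_TICK = DISPLAY_HZ // LOGIC_HZ  # = 2

def update_game_logic():
    print("logic tick")  # placeholder for advancing the game world

def present_frame(refresh):
    print(f"refresh {refresh}: showing latest image")  # placeholder

for refresh in range(6):              # simulate six display refreshes
    if refresh % REFRESHES_PER_TICK == 0:
        update_game_logic()           # only every other refresh
    present_frame(refresh)            # every refresh (same image twice)
```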
Some newer consoles could get away with 30 fps in some games as well - if movement wasn’t too fast, camera turn speed was limited by analog sticks, the game was played on a TV at a greater viewing distance than a typical PC monitor, maybe some motion blur was used, etc. - but even then it always was a bit of a crutch, in my opinion. With limited hardware power though, going for 60fps would have required a quite noticeable reduction in the visual fidelity of the graphics. So, fps regularly got sacrificed in favor of visuals.
But 60fps is just the perfect sweet spot. It’s smooth, flicker-free, and still works even when the player can turn the camera quickly and abruptly with the mouse. It has another big benefit, and that is low input latency. When running at 30 fps, your input to the game is read 30 times each second, and the effects of that input are shown on the next frame - which means at least 33 milliseconds later - possibly more, if you factor in additional latency caused by the mouse and drivers or the monitor. Running at 60fps brings that down to 17 milliseconds. In a fast-paced game where twitch skills matter, this does make a noticeable difference. And that’s also the reason why, even in the early days, people playing first-person shooters competitively liked to run the game at framerates even greater than 60fps - even if their monitors couldn’t show more than 60. Despite those additional images never being shown, the decrease in latency felt worth it to high-end players. And of course, not-so-high-end players would then go do the same thing. But be careful - there’s also a bit of a cargo cult going on there, with some fps priests singing praises far beyond the actually achievable benefits of higher framerates. And of course, manufacturers and sellers of monitors and video cards very much like to lean into that with their advertising.
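Those latency numbers fall straight out of the frame interval - a quick back-of-the-envelope calculation that ignores mouse, driver, and monitor latency, which all come on top:

```python
# Best case: at N fps, a new input is reflected on screen one frame
# interval (1000/N milliseconds) later at the earliest.
for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps -> frame interval of {1000 / fps:.1f} ms")
```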
Running at framerates higher than 60 does come with increased motion clarity - a small advantage that can also be relevant to games that don’t rely on twitch skills. Fast-moving objects can appear sharper and with more clarity if the framerate is higher. That benefit of course goes completely out the window if you turn on motion blur - or have to sacrifice resolution to achieve the higher framerate. That just totally counteracts it. But even with motion blur turned off, the actual gameplay benefits you get from increased motion clarity aren’t all that massive. Which also makes me question the value of AI frame generation. Maybe there’s an argument to be made for using framegen to push a game from 30 fps to 60 fps - to get smooth animation in a game where the hit you take to input latency doesn’t matter. An edge-case scenario. But I honestly can’t come up with any good use-case for multi-framegen at all. But I digress…
In Conclusion
In general, the thing is - you are getting heavily diminished returns from pushing fps to ever higher numbers. Going from 30 to 60 is still huge. But then, going from 60 to 120 gives you far less. And going from 120 to 240… many people aren’t even able to tell any difference there. Sure, if you know exactly what to look for, you might be able to tell the difference - but are you really gonna notice that while playing? Maybe a young player competitively playing a twitchy e-sports game over LAN (no added internet latency) does - but for an old, near-sighted fart like me playing a turn-based RPG like Baldur’s Gate 3, anything past 60fps is basically a waste. Also, getting a stable, consistent framerate is ultimately worth more than getting a higher “average” framerate with way more frame-to-frame variation.
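You can make those diminishing returns concrete with the same frame-interval math as before - each doubling of the framerate saves only half as many milliseconds per frame as the previous one:

```python
# Each doubling of fps halves the absolute frame-time saving,
# which is why 120 -> 240 is so much harder to perceive than 30 -> 60.
for lo, hi in ((30, 60), (60, 120), (120, 240)):
    saved_ms = 1000 / lo - 1000 / hi
    print(f"{lo:>3} -> {hi:>3} fps: frame time shrinks by {saved_ms:.1f} ms")
```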
So here’s the gist of it all:
- 24/30 fps is perfectly fine for movies and some slower-paced games where the camera doesn’t move too much.
- 45 fps can be an okay substitute for 60 on smaller screens - like the Steam Deck, for example.
- 60 fps is the sweet spot for most games. Very smooth, suitable for faster camera movements, and with great, low input latency.
- At 90 fps, diminishing returns kick in hard - but it might still be worth it for some fast-paced, twitchy, competitive first-person shooters.
- 120 fps maybe, if you’re a professional e-sports athlete… but even then, only in certain specific games. For anything played online, the effects of higher framerates will most likely be negligible next to the effects of network latency.
- And 240 and more fps… at that point, it’s most likely just placebo with zero actual benefit.
That being said, there are of course inter-individual differences. Some people are more sensitive than others - and increasing age generally makes you slower in both your reactions and signal processing. So, some people might be totally fine with movie-like 30fps - while others might feel that 120fps is fully worth it. But 60 fps is just that perfect sweet spot: still clearly noticeable for the vast majority of people, beneficial for the vast majority of games, and right before the point of greatly diminishing returns.