Forums  /  Speedrunning  /  Frame rates

To anyone who plays run and gun/platform shooter style games: if your frame rate was fluctuating between 59-61 quite a lot during a run, as opposed to a solid stable 60fps, would it possibly give you any sort of advantage at any points? I'm thinking you would end up with a bunch of points in game where you'd have slightly longer to react to things/deal with stuff.

If anyone else wants to chime in as a mod/runner of their game on what tolerances they allow in frame rates for such console games I'd be most grateful.


Really depends on how the frame fluctuation is handled. Some games skip frames, in which case you'd actually lose time to react.


Unless you have a display that supports over 60Hz, anything above 60fps that isn't a multiple of 60 will cause more issues than it solves, and even at multiples of 60 you'd often still get screen tearing instead of any real advantage. The difference between 1/60 and 1/61 of a second isn't even a millisecond btw, so a fluctuation that tiny will do literally nothing. For some games that's less than a single game tick, so reacting faster by a fraction of a millisecond makes no difference at all.

Edit: of course there are special cases of engines that run smoother without v-sync and just happen to go well beyond 60fps on anything from the current decade.


Assuming you have a display that can even handle higher than 60fps (which the vast majority of people don't), the difference between exactly 60 and 61fps theoretically gives you an extra ~0.0003 seconds to react to something, and 60 to 59 means you lose ~0.0003 seconds of reaction time. In practice, this doesn't matter at all. Humans literally cannot react to things that quickly (the average visual reaction time is about a quarter of a second, and even in a very focused test environment with training it doesn't get much lower than that), and even if you're superhuman, the input lag on your TV/monitor would completely negate any advantage you may have gotten.
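The frame-time arithmetic above can be checked with a quick sketch:

```python
# Frame-time differences between 59, 60, and 61fps, in milliseconds.
def frame_time_ms(fps):
    return 1000.0 / fps

# Time "gained" per frame at 61fps, and "lost" per frame at 59fps,
# relative to a steady 60fps:
diff_61 = frame_time_ms(60) - frame_time_ms(61)
diff_59 = frame_time_ms(59) - frame_time_ms(60)

print(f"60fps frame time: {frame_time_ms(60):.3f} ms")
print(f"61fps saves {diff_61:.3f} ms per frame")   # roughly 0.27 ms
print(f"59fps costs {diff_59:.3f} ms per frame")   # roughly 0.28 ms
```

Both differences land around 0.0003 seconds per frame, which is where the numbers above come from.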

I definitely wouldn't worry about framerate fluctuations in regards to speedrunning. The framerate of something as complex as a video game (especially a modern AAA game, especially on a modern operating system while sharing processing power with other applications/services) will always vary slightly for reasons completely outside of the player's control. There's no such thing as a perfectly stable 60fps.


Hard disagree on the above responses saying this is a negligible problem.

You're looking at framerate differences in specific instances, but it becomes a problem when you look at the entire run, where those differences accumulate into huge runtime differences. If framerate fluctuations are noticeable enough to point out (laggy sections notwithstanding), it should be assumed they occur frequently throughout the game.

Sonic Adventure DX for a long time was a NIGHTMARE to standardise because the optimal platform to run on was the 2003 PC Disc release. This version of the game runs anywhere from 60-63FPS, and constantly fluctuates to minor degrees.

Now, this "doesn't matter" for any specific section of the game, but across an entire run you're looking at a potential five percent difference in run length. One runner's 30 minute run is another runner's 31.5 minute run. Nowadays this isn't a concern because the game is timed with in-game time, which in my opinion is the best way to address variable framerate for run timing: an internal timer progresses at the same rate regardless of visual framerate.
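To see where that five percent comes from, here's a rough sketch, assuming (as with the 2003 SADX PC release) that game speed scales directly with framerate, so real-time run length scales inversely with fps:

```python
# A run that takes 30 minutes at a steady 60fps, in seconds:
run_at_60 = 30 * 60

# The same amount of in-game progress at higher framerates takes
# proportionally less real time when game speed is tied to fps:
for fps in (60, 61, 62, 63):
    real_time = run_at_60 * 60 / fps
    print(f"{fps}fps: {real_time / 60:.1f} minutes of real time")
```

At a sustained 63fps the run finishes about 1.4 minutes sooner in real time than at 60fps, which is roughly the 30 vs 31.5 minute gap described above.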

For the specific question at hand, abusing framerate difference to pull off certain tricks and whatnot, I don't think it's something you could easily enforce or really control. There will be many methods in a lot of games to deliberately introduce lag to slow the game down if you need to do something particularly precise, or otherwise just pause buffer it.


Originally posted by Drakodan: "One runner's 30 minute run is another runner's 31.5 minute run"

This is a good point that I didn't consider, though it's worth noting that this can only happen in games where the game speed is framerate-dependent. That's unfortunately the case for a lot of older games, but most modern games are framerate-independent (tying game speed to framerate is super lazy, especially if the game doesn't even have a proper framerate cap, as appears to be the case for Sonic Adventure DX). All the games I run are framerate-independent, so I didn't consider this.
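A minimal sketch of the framerate-dependent vs framerate-independent distinction, with illustrative names (not from any particular engine):

```python
SPEED = 300.0  # units per second

def update_dependent(pos, units_per_frame=5.0):
    # Moves a fixed amount every frame: higher framerate = faster game.
    return pos + units_per_frame

def update_independent(pos, dt):
    # Scales movement by frame time: same real-world speed at any fps.
    return pos + SPEED * dt

# Simulate one real-world second at two framerates:
for fps in (60, 63):
    dt = 1.0 / fps
    dep = indep = 0.0
    for _ in range(fps):
        dep = update_dependent(dep)
        indep = update_independent(indep, dt)
    print(f"{fps}fps: dependent moved {dep:.0f}, independent moved {indep:.0f}")
```

The framerate-dependent version covers 315 units at 63fps vs 300 at 60fps, i.e. the whole game runs 5% faster; the delta-time version covers 300 units either way.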

Nonetheless, as you noted, there isn't really a good way to handle these sorts of situations from a moderation perspective. You definitely wouldn't want to start invalidating runs just because the game is poorly made and lagged too much (or didn't lag enough) or anything like that.


Depends a lot on how the game was made; for most modern games it shouldn't matter. For some it can cause anomalies with physics, collision detection, or other systems while not affecting the actual speed of the game. But for older games where speed is directly tied to framerate it would be a consideration.

For example, in my main game, Castlevania: Symphony of the Night, it turned out that even with a relatively modern release on the Xbox 360 with a progressive HDMI signal, old broadcasting standards caused slight differences in game performance. North American Xboxes still output HDMI 60 at a refresh rate of 59.94Hz (the NTSC legacy rate), while European ones output a true 60Hz. The port of the game doesn't really take this into account, so this results in a very minor advantage for EU systems (about a second of difference per 20 minutes).
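That "second per 20 minutes" checks out with quick arithmetic, assuming the game ticks once per displayed frame:

```python
# NTSC-legacy drift: a framerate-locked game ticking at 59.94fps
# instead of 60fps falls behind by about 0.1% in real time.
eu_hz, na_hz = 60.0, 59.94
twenty_min = 20 * 60  # seconds

# Real time the NA console needs to render the same number of frames
# an EU console renders in 20 minutes:
na_time = twenty_min * eu_hz / na_hz
print(f"NA system is {na_time - twenty_min:.2f} s behind per 20 minutes")
```

The drift comes out to roughly 1.2 seconds per 20 minutes, matching the observed EU advantage.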