In-game timer
1 year ago

Hey, developer here. I've had a few people comment on the in-game timer reporting shorter times than expected, and I wanted to both explain how it works and ask if people would like to see changes in an update to the game.

The timer increments every gameplay frame and assumes exactly 60 (NTSC) or 50 (PAL) frames per second, reporting frames (not centiseconds) in the lowest portion. It starts when you make any movement after spawning, and it pauses on death, so it doesn't measure the death animation. The entire untimed span, covering the death, fade out, level load, and fade in, lasts 131 frames (plus 1 lag frame during level load) on both NTSC and PAL. How much goes untimed in total also depends on how long you wait before moving after you respawn, but it's 131 frames if you move right away.
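
For the curious, the bookkeeping is roughly this (a minimal Python sketch of the logic, not the game's actual code; the names are just illustrative):

```python
NTSC_FPS = 60  # the timer assumes exactly 60 (NTSC) or 50 (PAL) frames per second
PAL_FPS = 50

def tick(frames, has_moved_since_spawn, dead):
    """Advance the in-game timer by one gameplay frame.

    The timer only runs between your first movement after (re)spawning and
    the moment you die, so the death animation, fades, and level load are
    never counted.
    """
    if has_moved_since_spawn and not dead:
        frames += 1
    return frames

def format_igt(frames, fps=NTSC_FPS):
    """Format a frame count as M:SS.FF, where FF is a frame count
    (0-59 NTSC, 0-49 PAL), not centiseconds."""
    minutes, rest = divmod(frames, 60 * fps)
    seconds, frame_part = divmod(rest, fps)
    return f"{minutes}:{seconds:02d}.{frame_part:02d}"

format_igt(9400)  # '2:36.40' -- the .40 means 40/60 of a second on NTSC
```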

My intent was just to time actual gameplay, but it's clear that deaths near a checkpoint have very little cost, and in fact, crashing at high speed after touching a checkpoint can be faster than slowing down. I figured people would probably be playing competitively on expert, but given the difficulty of the game, people have gravitated toward normal, so it might make sense to start timing deaths to force a time cost.

I'm planning on releasing a minor update probably next month and am looking for feedback on whether the timer should be changed to include some or all of the untimed portion. I'm still planning to have the timer only start after your first movement when respawning, as a compromise for the game lacking a pause feature (and in fact, even gravity doesn't work until you move). I could make the game time all of the sections mentioned above, currently totaling 131 frames, though NTSC vs PAL complicates this somewhat; I intend to shorten the PAL death animation (pre-fade out) to 90 frames so it lasts the same real-world time as NTSC's 108 frames, but I can't really do much about the duration of the palette fades or level load. That said, PAL and NTSC times aren't directly comparable anyway because of meaningful gameplay differences (which I can elaborate on if there is interest), so perhaps that point is moot. I also don't intend to count lag frames, which have the potential to vary, though I suspect in practice there is always just 1 hidden lag frame when loading a level and that's it. [Edit: I think I have a minor preference for just timing the 108-frame death period, as that best aligns with the idea of measuring gameplay time, but trying to better match real-world time is also reasonable.]
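
To show the arithmetic behind that PAL change (just the nominal 60/50 FPS math, nothing from the game's code):

```python
# Planned death animation lengths vs. real-world time,
# assuming the nominal 60 (NTSC) / 50 (PAL) FPS the timer already uses.
ntsc_death_seconds = 108 / 60  # 1.8 s
pal_death_seconds = 90 / 50    # 1.8 s after shortening to 90 frames
```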

Note that old times would be easily converted to the new time system by adding a fixed amount per crashed ship. The ending screen would also indicate the game version. Other minor changes being made to the game for the update won't affect in-game timing.

So, would people like the timer to measure deaths? If so, should the entire 131 frames of death, fade out, level load, and fade in be counted, or a shorter period?

Thanks for any feedback!

Edited by the author 1 year ago
New Brunswick, Canada

i think it would be better if the timer is as RTA as possible, but it's not a big deal as long as it's the same timer for everyone. so if you die 3 times for example, how much time do you add to get a more accurate time? 3 x 131 frames? either way it's not a huge deal. wait a second, are you saying that if your time is 2:36.40 for example, that's 40/60 and not 40/100?

Edited by the author 1 year ago
New Brunswick, Canada

can we get milliseconds for the IGT? that would save some troubles

Yes, if we decide to time all 131 non-lag frames from death to respawn, and you died 3 times, you would add 3 x 131 frames to the time. (And if you wanted the lag frame, too, it would be 3 x 132, but I don't recommend that.)
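
In other words (a hypothetical helper, just to make the arithmetic concrete):

```python
def adjust_old_time(old_frames, deaths, frames_per_death=131):
    """Convert an old-version time to the proposed new timing by adding a
    fixed number of untimed frames per crashed ship (131, not counting the
    1 lag frame per level load)."""
    return old_frames + deaths * frames_per_death

# e.g. an NTSC 2:36.40 (9400 frames) with 3 deaths becomes
adjust_old_time(9400, 3)  # 9793 frames, i.e. 2:43.13
```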

And yes, in 2:36.40, the fractional part does indeed mean 40/60 (NTSC) or 40/50 (PAL) of a second, not 40 centiseconds. I could make the timer use centiseconds or milliseconds, but on NTSC that means rounding, so I opted for the precise frame count rather than an imprecise rounded number that makes it harder to get back to the game's real unit of time, which is frames. I also feel like milliseconds imply a kind of wall-clock precision the game doesn't have; like probably all retro games with timers, we already assume a false framerate of exactly 60 FPS, while the NTSC NES actually runs just under 60.1 FPS, a difference of about 1 frame every 10 seconds. (Having fractional seconds in some other base also isn't that uncommon; we use base 60 for seconds and minutes, too!)

Is the issue that speedrun.com requires milliseconds? I could presumably change the game to use fractional seconds out of 100 or 1000, and provide the lookup table to convert old-version frames to new-version milliseconds. I still don't love the rounding on NTSC, but the timer is for you guys, so it should be what works best for you.

Edited by the author 1 year ago
New Brunswick, Canada

speedrun.com doesn't require milliseconds but it helps separate runners that are close. at the same time i think it's ok to incorrectly use centiseconds because it doesn't change the order on the leaderboard

I asked around to see what sub-second units speedrun.com offers and am disappointed to learn it's apparently just whole seconds, deciseconds, centiseconds, and milliseconds.

I can change the timer to centiseconds or milliseconds in the next version if need be, but I wish speedrun.com would just allow other units. Note that converting frames to a speedrun.com-friendly millisecond count is easy: divide the frame component by 60 (NTSC) or 50 (PAL) to get the fraction of a second, then round to milliseconds.
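
That conversion, spelled out (a quick sketch; the helper name is made up):

```python
def frames_to_ms(frame_remainder, fps=60):
    """Round the timer's sub-second frame count (0-59 NTSC, 0-49 PAL)
    to a speedrun.com-friendly millisecond value."""
    return round(frame_remainder * 1000 / fps)

frames_to_ms(40)          # 667 ms on NTSC (40/60 of a second)
frames_to_ms(40, fps=50)  # 800 ms on PAL (40/50 of a second)
```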
