I am developing a game for the first time, and I'm wondering what game time should be based on. Should it be based on a real clock, or should it rely on frames? (Note: I'm not sure "game time" is the right term here; correct me if it isn't.)
To make this clearer, imagine these scenarios:
- Computer 1 is fast and runs at up to 60 frames per second
- Computer 2 is slow and runs at no more than 30 frames per second
The same game is played on both computers, and the character is supposed to move at the same speed.
If game time is based on frames, the character will move twice as fast on computer 1. If, on the other hand, game time is based on actual (wall-clock) time, computer 1 will render twice as many frames, but the character will move at the same speed as on computer 2.
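To make the two approaches concrete, here is a minimal sketch in C++ (chosen arbitrarily, since the question is language-agnostic). The function names and speed constants are made up for illustration; the key difference is whether movement is scaled by the elapsed time between frames:

```cpp
#include <chrono>

// Frame-based: the character moves a fixed amount every frame, so its
// on-screen speed scales with the frame rate (twice as fast at 60 FPS
// as at 30 FPS).
void updateFrameBased(float& positionX) {
    const float unitsPerFrame = 5.0f;    // hypothetical speed constant
    positionX += unitsPerFrame;
}

// Time-based: the character moves at a fixed speed in units per second,
// scaled by the time elapsed since the last frame, so it covers the same
// distance per second regardless of frame rate.
void updateTimeBased(float& positionX, float deltaSeconds) {
    const float unitsPerSecond = 300.0f; // hypothetical speed constant
    positionX += unitsPerSecond * deltaSeconds;
}

int main() {
    using Clock = std::chrono::steady_clock;
    float positionX = 0.0f;
    auto previous = Clock::now();

    while (true) {  // stand-in for the real game loop (runs until quit)
        auto now = Clock::now();
        float deltaSeconds =
            std::chrono::duration<float>(now - previous).count();
        previous = now;

        updateTimeBased(positionX, deltaSeconds);
        // ... render the frame ...
    }
}
```

With the time-based version, a frame that takes twice as long simply moves the character twice as far, so both computers see the same motion per second.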
My question is: what is the best way to handle game time, and what are the advantages and disadvantages of each approach?
language-agnostic
Harmen