Simulating time in a game loop using C++

I'm creating a 3D game from scratch in C++ using OpenGL and SDL on Linux, as a hobby and to learn more about this area of programming.

I'm interested in a better way to simulate the passage of time in the game. Currently I have a loop that looks something like this:

void main_loop()
{
    while (!quit) {
        handle_events();
        DrawScene();
        ...
        SDL_Delay(time_left());
    }
}

I use SDL_Delay() and time_left() to maintain a frame rate of about 33 frames per second.
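time_left() isn't shown here; a typical version, along the lines of the SDL documentation's SDL_Delay example, would be roughly this (the 30 ms TICK_INTERVAL for ~33 fps is an assumption):

const Uint32 TICK_INTERVAL = 30;   // ~33 frames per second
static Uint32 next_time;           // set to SDL_GetTicks() + TICK_INTERVAL before the loop

Uint32 time_left(void)
{
    Uint32 now = SDL_GetTicks();
    if (next_time <= now)
        return 0;                  // already late for this frame: don't sleep
    return next_time - now;        // milliseconds left in the current frame
}

/* in main_loop(), after SDL_Delay(time_left()):  next_time += TICK_INTERVAL; */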

I thought I just needed a few global variables, like

int current_hour = 0;
int current_mins = 0;
int num_days = 0;
Uint32 prev_ticks = 0;

Then a function like:

void handle_time()
{
    Uint32 current_ticks;
    Uint32 dticks;

    current_ticks = SDL_GetTicks();
    dticks = current_ticks - prev_ticks; // get difference since last time

    // if difference is greater than 30000 (half a minute), increment game mins
    if (dticks >= 30000) {
        prev_ticks = current_ticks;
        current_mins++;
        if (current_mins >= 60) {
            current_mins = 0;
            current_hour++;
        }
        if (current_hour > 23) {
            current_hour = 0;
            num_days++;
        }
    }
}

and then call the handle_time() function from the main loop.

It compiles and runs (I'm just printing the time to the console with printf for now), but I wonder if this is the best way to do it. Is there a simpler or more efficient approach?

+4
4 answers

I've already mentioned this in other game-related threads. As always, follow Glenn Fiedler's advice from his Game Physics articles.

What you want to do is use a fixed timestep, which you get by accumulating time deltas. If you want 33 logic updates per second, your fixed timestep should be 1/33 of a second. You could also call this the update frequency. You should also decouple the game logic from the rendering, since they don't belong together: you want to run the logic at a low, fixed rate while rendering as fast as the machine allows. Here is some sample code:

running = true;
unsigned int t = 0, t_accum = 0, lt = 0, ct = 0;
while (running) {
    while (SDL_PollEvent(&event)) {
        switch (event.type) {
            ...
        }
    }
    ct = SDL_GetTicks();
    t_accum += ct - lt;   // accumulate the time that has passed
    lt = ct;
    while (t_accum >= timestep) {
        t += timestep;    /* this is our actual time, in milliseconds. */
        t_accum -= timestep;
        for (std::vector<Entity>::iterator en = entities.begin();
             en != entities.end(); ++en) {
            integrate(en, (float)t * 0.001f, timestep);
        }
    }
    /* This should really be in a separate thread, synchronized with a mutex */
    std::vector<Entity> tmpEntities(entities.size());
    for (size_t i = 0; i < entities.size(); ++i) {
        float alpha = (float)t_accum / (float)timestep;
        tmpEntities[i] = interpolateState(entities[i].lastState, alpha,
                                          entities[i].currentState, 1.0f - alpha);
    }
    Render(tmpEntities);
}

This handles undersampling as well as oversampling. If you use integer arithmetic as done here, your game physics should be close to 100% deterministic, no matter how slow or fast the machine is. That is the advantage of advancing time in fixed steps. The state used for rendering is computed by interpolating between the previous and current states, where the leftover value in the time accumulator is used as the interpolation factor. This ensures that the rendering is smooth no matter how large the timestep is.
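The snippet doesn't define Entity, State, or interpolateState(); a minimal sketch of one possible shape, matching the (state, weight, state, weight) call above, is a plain weighted blend (all names and fields here are assumptions):

// Hypothetical types, not from the answer above.
struct State  { float x, y, z; };                 // e.g. just a position
struct Entity { State lastState, currentState; };

// Weighted blend of two state snapshots, producing a temporary
// entity that is only used for drawing.
Entity interpolateState(const State &a, float wa, const State &b, float wb)
{
    Entity out;
    out.currentState.x = a.x * wa + b.x * wb;
    out.currentState.y = a.y * wa + b.y * wb;
    out.currentState.z = a.z * wa + b.z * wb;
    out.lastState = out.currentState;
    return out;
}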

+13

In addition to the issues already mentioned (you should use a struct for the time and pass it to handle_time(), and note that your game minute advances every half real minute), your solution is fine for keeping track of the in-game time.
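A minimal sketch of that struct idea (illustrative names, not the poster's code; assumes SDL.h is included for Uint32 and SDL_GetTicks()):

struct GameClock {
    int    hour;
    int    mins;
    int    days;
    Uint32 prev_ticks;
};

void handle_time(GameClock &clock)
{
    Uint32 now = SDL_GetTicks();
    if (now - clock.prev_ticks >= 30000) {  // half a real minute per game minute
        clock.prev_ticks = now;
        if (++clock.mins >= 60) { clock.mins = 0; ++clock.hour; }
        if (clock.hour > 23)    { clock.hour = 0; ++clock.days; }
    }
}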

However, for most game events that should happen "every so often", you should probably base them on the main game loop rather than on actual time, so that they happen in the same proportion at different frame rates.
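For example (the names here are hypothetical), you could count loop iterations instead of milliseconds:

unsigned long frame_count = 0;
const unsigned long SPAWN_EVERY = 33 * 10;  // roughly every 10 seconds at ~33 fps

void update()
{
    ++frame_count;
    if (frame_count % SPAWN_EVERY == 0) {
        spawn_enemy();  // hypothetical game event tied to the loop, not the clock
    }
}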

+1

One of Glenn's posts you really want to read is Fix Your Timestep!. After looking at that link, I noticed that Mads pointed you to the same general place in his answer.

0

I'm not a Linux developer, but you might want to look into using timers instead of polling for ticks.

http://linux.die.net/man/2/timer_create

EDIT:
SDL seems to have timer support: SDL_SetTimer
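A rough sketch of that idea using SDL 1.2's SDL_AddTimer (a close relative of SDL_SetTimer; everything beyond the SDL API itself is illustrative). The callback runs on a separate thread, so it only pushes an event for the main loop to handle rather than touching game state directly:

#include <SDL.h>

Uint32 minute_tick(Uint32 interval, void * /*param*/)
{
    SDL_Event ev;
    ev.type = SDL_USEREVENT;   // picked up by the normal event loop
    SDL_PushEvent(&ev);
    return interval;           // keep the timer running
}

int main(int, char **)
{
    SDL_Init(SDL_INIT_TIMER | SDL_INIT_VIDEO);
    SDL_AddTimer(30000, minute_tick, NULL);   // one game minute per 30 real seconds
    /* ... normal event/render loop; advance the in-game clock
       whenever the SDL_USEREVENT arrives ... */
    SDL_Quit();
    return 0;
}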

-1
