I've gotten a bit lost with the new chrono library in C++.
I have an update loop. It performs two operations:

    engine.Update();
    engine.Render();
These are long operations, and it's hard to say exactly how long they take. So we measure how much time they use, then do some math to figure out the best way to incrementally call the update before we call render.
For this I'm using C++11 chrono. I picked it because it sounded like the way to go: more accurate and more platform independent. So far I'm finding it gives me more problems than it solves.
Below is my code along with my main problem. Any help with it, or with the correct way to do these operations, would be greatly appreciated!
I've marked my questions in comments right next to the corresponding lines, and I repeat them below.
Header file:
    class MyClass
    {
    private:
        typedef std::chrono::high_resolution_clock Clock;

        Clock::time_point mLastEndTime;        // end time of the previous frame
        std::chrono::milliseconds mDeltaTime;  // measured frame time
    };
Simplified update loop:
Main question: My mDeltaTime always ends up tiny, and the code basically gets stuck in an almost-infinite loop. That happens because kMaxDeltatime is so small, but if I'm aiming for 60 frames per second, am I not calculating the right number of milliseconds?
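Conceptually, the loop I'm describing works roughly like the sketch below (a simplified illustration only, not my exact code; the stub Engine, the two-second cutoff, and the accumulator variable are just there so the sketch stands on its own):

    #include <chrono>

    // Stub standing in for my real engine; Update() and Render()
    // are the two long-running calls mentioned above.
    struct Engine {
        void Update() {}
        void Render() {}
    };

    int main()
    {
        using Clock = std::chrono::high_resolution_clock;
        using std::chrono::milliseconds;
        using std::chrono::duration_cast;

        Engine engine;

        const milliseconds kMaxDeltatime(1000 / 60);   // 16 ms step for a 60 FPS target

        Clock::time_point lastEndTime = Clock::now();
        milliseconds accumulator(0);

        // Run for about two seconds just so the sketch terminates on its own.
        const auto stopAt = Clock::now() + std::chrono::seconds(2);

        while (Clock::now() < stopAt)
        {
            Clock::time_point now = Clock::now();

            // Wall-clock time since the end of the previous frame.
            // duration_cast<milliseconds> truncates, so a frame faster
            // than 1 ms measures as 0 ms here.
            accumulator += duration_cast<milliseconds>(now - lastEndTime);
            lastEndTime = now;

            // Consume the accumulated time in fixed-size update steps
            // before rendering the frame.
            while (accumulator >= kMaxDeltatime)
            {
                engine.Update();
                accumulator -= kMaxDeltatime;
            }

            engine.Render();
        }
    }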
Here are all the questions listed above:
    const milliseconds kMaxDeltatime((int)((1.0f / 60.0f) * 1000.0f));
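For reference, the arithmetic there is (1.0f / 60.0f) * 1000.0f ≈ 16.67, which the int cast truncates to 16, so kMaxDeltatime ends up as 16 ms. A sketch of how the same target could be written with chrono types directly (just an illustration, not necessarily what I should be doing):

    #include <chrono>

    // Target step for 60 FPS, expressed directly in chrono types.
    // 1000 / 60 truncates to 16 ms, the same value as the float math above.
    const std::chrono::milliseconds kMaxDeltatime(1000 / 60);

    // Or keep sub-millisecond precision by using a finer unit:
    const std::chrono::microseconds kMaxDeltatimeUs(1000000 / 60);   // 16666 us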
Sorry for the confusion; I feel completely lost with this chrono library. Most help sites, reference material, and even fairly direct sample code are hard for me to read and connect to what I'm trying to do. Pointers on where to look for solutions or example code are very welcome!
EDIT: Joachim pointed out that std::min/max only works when both arguments are milliseconds! I've updated the code to reflect the change.
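A minimal sketch of what that clamp looks like once both arguments are milliseconds (the ClampDelta helper is just mine for illustration; assume the two time points come from a loop like the one above):

    #include <algorithm>
    #include <chrono>

    using Clock = std::chrono::high_resolution_clock;
    using std::chrono::milliseconds;
    using std::chrono::duration_cast;

    const milliseconds kMaxDeltatime(1000 / 60);

    // Both arguments to std::min are milliseconds, so the template
    // argument deduces cleanly and the delta gets capped at 16 ms.
    milliseconds ClampDelta(Clock::time_point frameStart, Clock::time_point frameEnd)
    {
        milliseconds delta = duration_cast<milliseconds>(frameEnd - frameStart);
        return std::min(delta, kMaxDeltatime);
    }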