Why does CPU usage increase when I minimize my application?

I am programming a calculator. When the window is maximized, CPU consumption is about 12%, but when it is minimized, CPU consumption rises to about 50%. Why is this happening, and how can I prevent it? Here is the piece of code that I believe is causing the problem.

LRESULT CALLBACK WndProc(HWND hWnd, UINT uMsg, WPARAM wParam, LPARAM lParam)
{
    switch(uMsg)
    {
    case WM_ACTIVATE:
        if(!HIWORD(wParam))
            active = true;
        else
            active = false;
        return 0;

    case WM_SYSCOMMAND:
        switch(wParam)
        {
        case SC_SCREENSAVE:
        case SC_MONITORPOWER:
            return 0;
        }
        break;

    case WM_CLOSE:
        PostQuitMessage(0);
        return 0;

    case WM_KEYDOWN:
        if((wParam >= VK_LEFT && wParam <= VK_DOWN) || wParam == VK_CONTROL)
            myCalc.handleInput(wParam, true);
        return 0;

    case WM_CHAR:
        myCalc.handleInput(wParam);
        return 0;

    case WM_SIZE:
        ReSizeGLScene(LOWORD(lParam), HIWORD(lParam)); // LOWORD = width; HIWORD = height
        return 0;
    }
    return DefWindowProc(hWnd, uMsg, wParam, lParam);
}

int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPSTR lpCmdLine, int nShowCmd)
{
    MSG msg;
    if(!CreateGLWindow(WINDOW_CAPTION, WINDOW_WIDTH, WINDOW_HEIGHT, WINDOW_BPP))
        return 0;

    while(!done) // Main loop
    {
        if(PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
        {
            if(msg.message == WM_QUIT)
                done = true;
            else
            {
                TranslateMessage(&msg); // Translate the message
                DispatchMessage(&msg);  // Dispatch the message
            }
        }
        else
        {
            // Start the time handler
            myTimeHandler.Start();

            // Draw the GL scene
            if(active)
            {
                DrawGLScene();    // Draw the scene
                SwapBuffers(hDC); // Swap buffers (double buffering)
            }

            // Regulate the FPS
            myTimeHandler.RegulateFps();
        }
    }

    // Shutdown
    KillGLWindow();
    return (msg.wParam);
}
3 answers

I assume your main loop runs without any delay when active is false. The thread spins endlessly in that loop and fully occupies one of your processor's two cores (that is the 50% CPU load you see).

If active is true, the swap operation waits for the next vsync, which delays your loop until the next screen refresh and results in lower CPU utilization. (Time spent blocked inside a Windows wait function is not counted toward CPU load.)

To solve this problem, you can switch to a GetMessage-based message loop whenever you do not need to render anything.
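A minimal sketch of such a hybrid loop, reusing the question's own globals and helpers (done, active, hDC, DrawGLScene, and so on); this is an illustration of the idea, not a drop-in replacement:

```cpp
// Sketch: poll with PeekMessage while rendering, block with GetMessage when idle.
MSG msg;
while(!done)
{
    if(active)
    {
        // Window is visible: drain pending messages without blocking, then render.
        while(PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
        {
            if(msg.message == WM_QUIT)
                done = true;
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        DrawGLScene();
        SwapBuffers(hDC);
    }
    else
    {
        // Window is minimized/inactive: block until a message arrives,
        // so the thread sleeps instead of spinning at 100% on one core.
        if(GetMessage(&msg, NULL, 0, 0) <= 0)
            done = true; // WM_QUIT (returns 0) or error (returns -1)
        else
        {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
    }
}
```

When the WM_ACTIVATE handler flips active back to true, the next message dispatched by GetMessage wakes the loop and rendering resumes.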


The smaller the area the OpenGL window covers, the faster the scene renders (the key term is fill rate), and therefore the event loop iterates at a higher frequency. I see you have some RegulateFps function; to me that sounds like a busy-wait that burns time until a certain interval per frame has elapsed. In other words, you are literally wasting CPU time just to... uhh, why do you want to cap the frame rate in the first place? Get rid of it.

And, of course, when the window is minimized you set active = false, so you do no GL work at all but still waste time in the busy loop.

Try enabling V-sync in the driver settings and using double buffering; then SwapBuffers will block until the vertical blank. And when active == false, use GetMessage instead of PeekMessage.
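Besides the driver control panel, vsync can also be requested from code through the WGL_EXT_swap_control extension. A sketch, to be run once after the GL context is created; real code should also check the extension string, since availability depends on the driver:

```cpp
// Sketch: request vsync programmatically via WGL_EXT_swap_control.
// wglSwapIntervalEXT is an extension function, so it must be loaded at runtime.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
    (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");

if(wglSwapIntervalEXT)
    wglSwapIntervalEXT(1); // 1 = swap at most once per vertical blank
```

With a swap interval of 1, SwapBuffers blocks until the next refresh, which caps the frame rate at the monitor's refresh rate and lets the thread sleep between frames.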


OK, I see one logic flaw in your code:

while(!done) // Main loop
{
    if(PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
    {
        ......
    }
    else
    {
        .... rendering.
    }
}

When the window is minimized, PeekMessage will usually find no messages, so you always fall into the rendering branch,

which pegs a single CPU core at 100%, because the loop never sleeps or waits; it just draws again and again.

You can instead enforce a minimum interval between rendered frames.

My suggestion:

// Set a timer to wake up your process periodically.
while(::GetMessage(&msg, NULL, 0, 0))
{
    // Handle the messages.

    // Check whether enough time has passed to render another frame.
    if(currentTime - lastDrawTime > MINIMUM_RENDER_INTERVAL)
    {
        // Render here.
    }
}
