CG 3.0 leak?

I've found what looks like a memory leak in Cg. I've sent a report via nvidia.com, but you can try it here:

If you delete the line that says

cgD3D11SetTextureParameter( g.theTexture, g.sharedTex ) ; 

The leak stops.

Is Cg 3.0 really leaking?

Setup: 64-bit Windows 7, ATI Radeon 5850.

+4
3 answers

Yes, it's leaking. Internally it creates a ShaderResourceView on every call and never releases it. I think the API is badly designed: they should have taken a ShaderResourceView* as the parameter to this function, rather than just the resource pointer (ID3D11Resource*).

I posted this on the NVIDIA forums about 6 months ago and never received a response.

Is your report published publicly, or is it some sort of private support ticket?

+3

Yes, Cg 3.0 leaks memory every time you call cgD3D11SetTextureParameter(), steadily consuming your application's memory. Unfortunately, this makes Cg 3.0 with D3D11 completely unusable. One symptom is that after a while your application will stop rendering and the screen will just go black. I spent a lot of time trying to track down the cause before discovering the bug in Cg.

If anyone is wondering why this isn't apparent in the Cg D3D11 demos, it's because the few that actually use textures are so simple that they get away with calling cgD3D11SetTextureParameter() only once at startup.

The same bug is still present in Cg Toolkit 3.1 (April 2012).

+1

jmp [UPDATE] ;; skip obsolete text segment

Could it be that Cg is destroyed after D3D, so it doesn't release its reference in time? Or vice versa? For example, a function that acquires the texture but doesn't release it until D3D shuts down, because when you set a texture on a shader, the texture is held until the shader's resources are released one way or another.

You tear down the D3D context here:

    SAFE_RELEASE( g.d3d );
    SAFE_RELEASE( g.gpu );

Later you release the shaders, in CleanupCg():

    cgDestroyProgram( g.v_vncShader );
    checkForCgError( "destroy vertex program" );
    cgDestroyProgram( g.px_vncShader );
    checkForCgError( "destroy fragment program" );

Try reordering the calls so that you first release all resources from both Cg and D3D. You also need to call

    cgD3D11SetDevice( g.cgContext, NULL );

before releasing the D3D context, just in case.

UPDATE:

Inside WinMain() the order should be different. You currently have:

    initD3D11() ;   // << FIRST you init D3D
    initCg() ;      // << SECOND you init Cg with the D3D pointers
    initD2D1() ;
    initVBs() ;

    // Main message loop
    while( WM_QUIT != msg.message ) { /* loop code */ }

    CleanupDevice(); // << FIRST you release all of D3D, while Cg is still referencing it (why?)
    CleanupCg();     // << SECOND: anything in the Cg runtime that depends on the d3d context
                     //    you just destroyed will crash, leak, or do whatever it wants

Therefore, you must swap them to ensure that Cg releases any D3D pointers first:

    CleanupCg();     // << FIRST release Cg, ensuring it no longer references D3D.
    CleanupDevice(); // << SECOND: D3D is neither referencing nor referenced by Cg now,
                     //    so just release it all.

You could also provide debugger output and other information, as I asked over there. Basically you're saying "Cg seems to be broken, here is all the code, look at line ###, is it broken?", but your file contains more than a thousand lines (1012) of C, C++, and shader code. You give almost no information, yet readily pin it on a Cg bug (based on what?). Why would anyone look through the code if the code were fine?

It's not that I dislike it, but it has little things like the ordering of calls: silly mistakes that can be real hell to debug. That one is a clear bug, and if I found an error just by looking at Main, well, there's a long way from there to the render calls and the Cg usage, right? I can't run the application on WinXP, but these bugs are in the most predictable places :)

So... once your code is clear of any errors... oh! Look what I just found:

    ~VertexBuffer()
    {
        SAFE_RELEASE( vb );
        SAFE_RELEASE( layout );
    }

But in the VertexBuffer constructor you call iD3D->GetImmediateContext( &gpu ); and keep the pointer in a private member, so... shouldn't you add:

    SAFE_RELEASE( gpu ); // ? there are 3 VertexBuffer instances, so that's another memory leak.

So there are several things in your code that cause memory leaks and need fixing, and I found them at a glance, so you really haven't tried. On the other hand, your code is clean and full of explanations, and I need to learn DX11, so I have to thank you for that. The jab was somewhat rude on purpose :P, because I'm probably right, and other people will avoid reading your code the moment the page loads.

-1
