I have an out-of-process COM server written in C++ that is called by some C# client code. A method on one of the server's interfaces returns a large BSTR to the client, and I suspect this is causing a memory leak. The code works; I'm just looking for help with marshalling BSTRs.
Simplifying a bit, the IDL for the server method is:
HRESULT ProcessRequest([in] BSTR request, [out] BSTR* pResponse);
and the implementation is as follows:
HRESULT MyClass::ProcessRequest(BSTR request, BSTR* pResponse)
{
    USES_CONVERSION;
    char* pszRequest = OLE2A(request);
    char* pszResponse = BuildResponse(pszRequest);
    delete pszRequest;
    *pResponse = A2BSTR(pszResponse);
    delete pszResponse;
    return S_OK;
}
A2BSTR internally allocates the BSTR using SysAllocStringLen().
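As I understand it, A2BSTR is doing something roughly equivalent to the sketch below (my own illustration, not the actual ATL source; the helper name AnsiToBstr is made up):

#include <windows.h>
#include <oleauto.h>

// Roughly what A2BSTR does: convert the ANSI string to UTF-16 and
// allocate the result on the COM heap via the BSTR allocator.
BSTR AnsiToBstr(const char* psz)   // hypothetical helper, for illustration only
{
    if (!psz)
        return NULL;

    // Wide-character length, excluding the terminating null.
    int cch = MultiByteToWideChar(CP_ACP, 0, psz, -1, NULL, 0) - 1;

    // SysAllocStringLen(NULL, cch) reserves cch characters plus a null
    // terminator on the COM heap (not the CRT heap) and sets the length prefix.
    BSTR bstr = SysAllocStringLen(NULL, cch);
    if (bstr)
        MultiByteToWideChar(CP_ACP, 0, psz, -1, bstr, cch + 1);

    return bstr;
}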
In the C# client, I just do the following:
string request = "something";
string response = "";
myserver.ProcessRequest(request, out response);
DoSomething(response);
This works: the request string is sent to the COM server and the correct response string is returned to the C# client. But every round trip to the server leaks memory in the server process. The CRT leak-detection support does not show any significant leaks on the CRT heap, so I suspect the leaked memory was allocated by IMalloc.
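For what it's worth, this is the kind of native test loop I could use to reproduce the leak without involving the C# client at all (a sketch only; IMyServer, CLSID_MyClass and the commented-out header are placeholders for my actual generated names):

#include <windows.h>
#include <atlbase.h>
// #include "MyServer_i.h"  // placeholder: generated header defining IMyServer / CLSID_MyClass

int main()
{
    ::CoInitialize(NULL);
    {
        CComPtr<IMyServer> spServer;
        if (SUCCEEDED(spServer.CoCreateInstance(CLSID_MyClass, NULL, CLSCTX_LOCAL_SERVER)))
        {
            for (int i = 0; i < 100000; ++i)
            {
                CComBSTR request(L"something");
                CComBSTR response;
                // The returned BSTR is freed here on the client side (CComBSTR calls
                // SysFreeString), so any steady growth should be in the server process.
                spServer->ProcessRequest(request, &response);
            }
        }
    }
    ::CoUninitialize();
    return 0;
}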
Am I doing something wrong here? I have found vague comments along the lines of "all out parameters must be allocated using CoTaskMemAlloc, otherwise the marshaller will not release them", but no details.
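For reference, my current understanding of the allocation conventions that comment seems to refer to is sketched below (the GetName/GetPath methods are made up for illustration). If this is right, then the SysAllocStringLen inside A2BSTR should already be the correct allocator for a BSTR out parameter, and the caller (here, the .NET interop marshaller) is supposed to free it with SysFreeString:

#include <windows.h>
#include <oleauto.h>
#include <objbase.h>
#include <cstring>

// BSTR out parameters: allocate with SysAllocString / SysAllocStringLen;
// the caller frees them with SysFreeString.
HRESULT GetName(BSTR* pName)                         // hypothetical method
{
    *pName = ::SysAllocString(L"example");
    return *pName ? S_OK : E_OUTOFMEMORY;
}

// Other [out] pointers (e.g. a raw LPWSTR): allocate with CoTaskMemAlloc;
// the caller frees them with CoTaskMemFree.
HRESULT GetPath(LPWSTR* ppPath)                      // hypothetical method
{
    static const wchar_t src[] = L"C:\\example";
    *ppPath = static_cast<LPWSTR>(::CoTaskMemAlloc(sizeof(src)));
    if (!*ppPath)
        return E_OUTOFMEMORY;
    memcpy(*ppPath, src, sizeof(src));
    return S_OK;
}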
Andy