In a utility that I wrote in C++ for testing purposes, I currently use the SendInput function to emulate user input, i.e. mouse and keyboard events. I would like to extend my program so that it can emulate touch events, too, so that I can verify that the program under test correctly processes e.g. WM_TOUCH messages (even though that message is deprecated, I don't care — I just want to check whether it is handled correctly). I don't need to send the touch events to a specific HWND; sending them to some X/Y coordinates on the desktop is enough for my purposes.
I looked into using HARDWAREINPUT structures with SendInput, as well as sending WM_TOUCH (or WM_GESTURE?) messages directly, but in either case it's not clear to me how to pass the information about where and how the touch event occurred.
The WM_TOUCH documentation explains that the lParam argument is:
Contains a touch input handle that can be used in a call to GetTouchInputInfo to retrieve detailed information about the touch points associated with this message.
... but it is not clear to me how I could forge such a "touch input handle". Is it always a plain pointer into the address space of the current process? If so, what value should be used for the hSource field of the TOUCHINPUT structure?
c++ winapi touch multi-touch wm-touch
Frerich Raabe