The iTunes mini-player (to give just one example) supports click-through for its play/pause and volume controls: they work even when the application is not frontmost, and clicking them does not bring it to the front.
How is this done?
I was looking through Apple's documentation and came across the Cocoa Event Handling Guide, in the section on event dispatch, which states:

"Some events, many of which are defined by the Application Kit (type NSAppKitDefined), have to do with actions controlled by the window or the application object itself. Examples of these events are those related to activating, deactivating, hiding, and showing the application. NSApp filters these events out early in its dispatch routine and handles them itself."
So, from my limited understanding (see "How an Event Enters a Cocoa Application"), subclassing NSApplication and overriding - (void)sendEvent:(NSEvent *)theEvent should let me catch every mouse and keyboard event, yet the window is still brought to the front when clicked. So either the window is activated before the event is seen by NSApplication, or I'm missing something else.
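To be concrete, this is roughly what I'm doing (a minimal sketch; the class name MyApplication is just a placeholder, and NSPrincipalClass in Info.plist is set to it so it's used in place of NSApplication):

```objc
// MyApplication.m -- NSApplication subclass that logs every event
// before passing it on to the normal dispatch machinery.
#import <Cocoa/Cocoa.h>

@interface MyApplication : NSApplication
@end

@implementation MyApplication

- (void)sendEvent:(NSEvent *)theEvent
{
    // Every keyboard and mouse event should pass through here...
    NSLog(@"event type: %lu", (unsigned long)[theEvent type]);

    // ...yet the window is still raised on a click, which suggests
    // activation happens before (or outside of) this method.
    [super sendEvent:theEvent];
}

@end
```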
I've also looked at Matt Gallagher's "Demystifying NSApplication by recreating it"; unfortunately, Matt doesn't cover the event queue, so beyond that I'm at a dead end.
Any help would be appreciated, thanks.
Edited to add: I found a post on Lloyd's Lounge where the author discusses the same problem and links to a CocoaBuilder thread, "suppress right mouse down". I'm currently trying out the code provided there; after some fiddling and adding an NSLog of [theEvent type], the left-mouse activation events are being logged.
The logged sequence of event types on a click is 13, 1, 13, which corresponds to NSAppKitDefined, NSLeftMouseDown, NSAppKitDefined. Can those NSAppKitDefined events be suppressed somehow?
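For what it's worth, the experiment I'm trying looks roughly like this (my own sketch, not a known solution; I don't yet know whether swallowing these events is safe):

```objc
// Inside the NSApplication subclass: drop the NSAppKitDefined
// events that bracket the click, to see whether that prevents
// the window from being activated.
- (void)sendEvent:(NSEvent *)theEvent
{
    // Logged values: 13 == NSAppKitDefined, 1 == NSLeftMouseDown.
    if ([theEvent type] == NSAppKitDefined) {
        // Swallow the event instead of dispatching it.
        return;
    }
    [super sendEvent:theEvent];
}
```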