Apologies for the platform-specific question, but this seems like a good place to find knowledgeable folks.
I was looking into extending pynput to get multitouch events from a trackpad on macOS. I'm following the pattern of the mouse and keyboard listeners, and I feel like I'm close, but I've hit a stumbling block.
I can get a stream of NSEvent objects that seem to represent the touches. But to get the actual touch data (e.g., location on the trackpad), you need NSTouch objects, which I tried to retrieve via the event's touchesMatchingPhase_inView_ method, without success.
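For reference, here is roughly what that call looks like under PyObjC. This is only a sketch: the helper name is mine (not part of pynput or AppKit), it needs pyobjc on macOS to do anything useful, and elsewhere it just returns an empty list.

```python
# Sketch of pulling NSTouch objects off an NSEvent with PyObjC.
# touches_from_event is a hypothetical helper name; it degrades to an
# empty list when AppKit (pyobjc) is unavailable, e.g. off macOS.
try:
    import AppKit
except ImportError:
    AppKit = None  # pyobjc not installed / not on macOS

def touches_from_event(event, view=None):
    """Return the NSTouch objects attached to `event`.

    Passing view=None corresponds to the Nil-view variant discussed
    below; the OS may still attach no touches unless some NSView has
    opted in via setAcceptsTouchEvents_(True).
    """
    if AppKit is None or event is None:
        return []
    # NSTouchPhaseAny matches touches in every phase.
    return event.touchesMatchingPhase_inView_(AppKit.NSTouchPhaseAny, view)
```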
One possibility is that the OS won't attach touch data to events if the application hasn't called the NSView method setAcceptsTouchEvents, which the docs mention. However:
I don't know if it's possible to get an NSView for a command-line Python script. I was hoping I could get a view associated with the Terminal window, but I've tried NSApplication.sharedApplication().windows() with no luck.
This post suggests that passing Nil as the view parameter of touchesMatchingPhase_inView_ should work. But we might need a view that is the firstResponder -- which again hits the problem that we don't have a view.
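For what it's worth, if touches ever do come back, each NSTouch carries a normalizedPosition in [0, 1] x [0, 1] (origin at the bottom-left of the trackpad) plus a deviceSize in points, so mapping to physical trackpad coordinates is simple arithmetic. A pure-Python sketch (the function name is mine, not AppKit's):

```python
# Sketch: converting an NSTouch's normalizedPosition ([0, 1] x [0, 1],
# origin bottom-left) to physical trackpad coordinates using deviceSize
# (reported in points). Pure Python, so it needs no pyobjc; the
# function name is hypothetical, not part of pynput or AppKit.
def touch_location(normalized_pos, device_size):
    nx, ny = normalized_pos      # e.g. touch.normalizedPosition.x / .y
    width, height = device_size  # e.g. touch.deviceSize.width / .height
    return (nx * width, ny * height)

# A touch in the middle of a 160 x 115 point trackpad:
print(touch_location((0.5, 0.5), (160.0, 115.0)))  # (80.0, 57.5)
```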
Has anyone gone down this path before?
Thanks!
Unfortunately, I no longer have access to a macOS system, so I cannot help you. I would very much appreciate your contribution if you manage to get it to work, however.
I put a complete example here: https://gist.github.com/adamberenzweig/25d71d97ca7c1510fefd5a662791f576