85bce8b8ad
Added multi-touch support for Mac OS X (>= 10.6), which forwards all multi-touch events from a trackpad to the corresponding osgGA event structures. The support is switched off by default, but you can enable it via a new flag for GraphicsWindowCocoa::WindowData or directly via the GraphicsWindowCocoa class (see the sketches below).

After switching multi-touch support on, all mouse events from the trackpad are ignored; otherwise you would get multiple events for the same pointer, which is very confusing (the trackpad reports absolute movement, whereas a mouse reports relative movement). I don't think this is a problem, as multi-touch input is a completely different beast than a mouse, so you will have to code your own event handlers anyway.

While coding this, I asked myself whether we should refactor GUIEventAdapter/EventQueue and assign a specific event type for touch input instead of reusing PUSH/DRAG/RELEASE. That would make it clearer how to use the code, but it would break the mouse emulation for the first touch point, and with it all existing manipulators. What do you think? I am happy to code the proposed changes.

Additionally, I created a small (and ugly) example, osgmultitouch, which makes use of the osgGA::MultiTouchTrackballManipulator, shows all touch points on a HUD and demonstrates how to get the touch points from an osgGA::GUIEventAdapter. There is even a small example video here: http://vimeo.com/31611842
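A minimal sketch of switching the support on via the GraphicsWindowCocoa class (Mac-only; the setter name setMultiTouchEnabled() is an assumption based on the description above, so check the header):

```cpp
#include <osgViewer/Viewer>

#if defined(__APPLE__)
#include <osgViewer/api/Cocoa/GraphicsWindowCocoa>
#endif

// Enable multi-touch on every realized window of the viewer.
void enableMultiTouch(osgViewer::Viewer& viewer)
{
    osgViewer::Viewer::Windows windows;
    viewer.getWindows(windows);

#if defined(__APPLE__)
    for (osgViewer::Viewer::Windows::iterator itr = windows.begin();
         itr != windows.end(); ++itr)
    {
        // Only Cocoa windows know about multi-touch, hence the cast.
        // setMultiTouchEnabled() is the "directly via the
        // GraphicsWindowCocoa class" route; treat the name as an assumption.
        osgViewer::GraphicsWindowCocoa* win =
            dynamic_cast<osgViewer::GraphicsWindowCocoa*>(*itr);
        if (win) win->setMultiTouchEnabled(true);
    }
#endif
}
```

And a sketch of pulling the touch points out of an osgGA::GUIEventAdapter inside an event handler; isMultiTouchEvent(), getTouchData() and the TouchPoint fields used here are likewise assumptions about the new API:

```cpp
#include <osgGA/GUIEventHandler>
#include <iostream>

// Dumps every touch point of every touch event to stdout. Touch events
// still arrive as PUSH/DRAG/RELEASE; the per-finger data hangs off the
// GUIEventAdapter.
class TouchDumpHandler : public osgGA::GUIEventHandler
{
public:
    virtual bool handle(const osgGA::GUIEventAdapter& ea,
                        osgGA::GUIActionAdapter& /*aa*/)
    {
        if (!ea.isMultiTouchEvent()) return false;

        osgGA::GUIEventAdapter::TouchData* touches = ea.getTouchData();
        for (unsigned int i = 0; i < touches->getNumTouchPoints(); ++i)
        {
            const osgGA::GUIEventAdapter::TouchData::TouchPoint& tp =
                touches->get(i);
            std::cout << "touch " << tp.id << " phase " << tp.phase
                      << " at (" << tp.x << ", " << tp.y << ")" << std::endl;
        }
        return false; // don't swallow the event
    }
};
```

The osgmultitouch example combines a handler along these lines with viewer.setCameraManipulator(new osgGA::MultiTouchTrackballManipulator) for the camera control shown in the video.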
api
CompositeViewer
Export
GraphicsWindow
Renderer
Scene
Version
View
Viewer
ViewerBase
ViewerEventHandlers