0e3de701d9
10.6), which will forward all multi-touch events from a trackpad to the corresponding osgGA event structures. The support is switched off by default, but you can enable multi-touch support via a new flag for GraphicsWindowCocoa::WindowData or directly via the GraphicsWindowCocoa class. After switching multi-touch support on, all mouse events from the trackpad are ignored; otherwise you would get multiple events for the same pointer, which is very confusing (the trackpad reports absolute movement, whereas a mouse reports relative movement). I don't think this is a problem, as multi-touch input is a completely different beast from a mouse, so you'll have to code your own event handlers anyway. While coding this I asked myself whether we should refactor GUIEventAdapter/EventQueue and assign a specific event type for touch input instead of using PUSH/DRAG/RELEASE. That would make it clearer how to use the code, but it would break the mouse emulation for the first touch point and with it all existing manipulators. What do you think? I am happy to code the proposed changes. Additionally I created a small (and ugly) example, osgmultitouch, which makes use of the osgGA::MultiTouchTrackballManipulator, shows all touch points on a HUD and demonstrates how to get the touch points from an osgGA::GUIEventAdapter. There's even a small example video here: http://vimeo.com/31611842
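As a rough illustration of the approach described above, a custom handler can pick touch points out of the ordinary PUSH/DRAG/RELEASE events via the GUIEventAdapter touch API (isMultiTouchEvent()/getTouchData()). This is only a sketch against the API as described in this submission; exact member names and availability depend on your OSG version, and TouchDumpHandler is a hypothetical class name:

```cpp
// Hedged sketch: a touch-aware event handler that dumps all touch points.
// Requires an OSG build with the multi-touch support described above.
#include <osgGA/GUIEventHandler>
#include <iostream>

class TouchDumpHandler : public osgGA::GUIEventHandler
{
public:
    virtual bool handle(const osgGA::GUIEventAdapter& ea,
                        osgGA::GUIActionAdapter&)
    {
        switch (ea.getEventType())
        {
        // Touch input still arrives as PUSH/DRAG/RELEASE; the first
        // touch point doubles as the emulated mouse pointer.
        case osgGA::GUIEventAdapter::PUSH:
        case osgGA::GUIEventAdapter::DRAG:
        case osgGA::GUIEventAdapter::RELEASE:
            if (ea.isMultiTouchEvent())
            {
                osgGA::GUIEventAdapter::TouchData* touches = ea.getTouchData();
                for (unsigned int i = 0; i < touches->getNumTouchPoints(); ++i)
                {
                    const osgGA::GUIEventAdapter::TouchData::TouchPoint& tp =
                        touches->get(i);
                    std::cout << "touch " << tp.id << " at ("
                              << tp.x << ", " << tp.y << ")" << std::endl;
                }
                return true; // consume touch events ourselves
            }
            break;
        default:
            break;
        }
        return false; // let plain mouse events pass through
    }
};
```

Returning true for multi-touch events keeps them away from mouse-based manipulators, which matches the point above that touch input needs its own handlers rather than the mouse emulation path.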
Changed files:

CMakeLists.txt
CompositeViewer.cpp
DarwinUtils.h
DarwinUtils.mm
GraphicsWindow.cpp
GraphicsWindowCarbon.cpp
GraphicsWindowCocoa.mm
GraphicsWindowIOS.mm
GraphicsWindowWin32.cpp
GraphicsWindowX11.cpp
HelpHandler.cpp
IOSUtils.h
IOSUtils.mm
PixelBufferCarbon.cpp
PixelBufferCocoa.mm
PixelBufferWin32.cpp
PixelBufferX11.cpp
Renderer.cpp
Scene.cpp
ScreenCaptureHandler.cpp
StatsHandler.cpp
Version.cpp
View.cpp
Viewer.cpp
ViewerBase.cpp
ViewerEventHandlers.cpp