The solution is to refactor the way that events are checked, so I have added a bool return type to the checkEvents() method across the osgViewer::GraphicsWindow, osgGA::Device and osgViewer::Viewer/CompositeViewer classes.
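As an illustration only, here is a minimal sketch of what that contract looks like for a custom input device; the PollingDevice class, its _stillConnected member and the polling details are hypothetical, and only the bool-returning checkEvents() override reflects the change described above.

    #include <osgGA/Device>

    // Hypothetical custom input device built against the new bool-returning checkEvents().
    class PollingDevice : public osgGA::Device
    {
    public:
        PollingDevice() : _stillConnected(true) {}

        virtual bool checkEvents()
        {
            // poll the (hypothetical) input hardware and push any new events
            // into getEventQueue() here ...
            return _stillConnected;   // false: this device has nothing more to deliver
        }

    private:
        bool _stillConnected;
    };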
10.6), which will forward all multi-touch events from a trackpad to the
corresponding osgGA-event-structures.
Multi-touch support is switched off by default; you can enable it via a new flag on GraphicsWindowCocoa::WindowData or directly via the GraphicsWindowCocoa class.
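As a hedged sketch only: enabling it from application code might look like the snippet below; the setter name setMultiTouchEnabled() is an assumption based on the description above, so check the committed GraphicsWindowCocoa header for the actual flag and method names.

    #include <osgViewer/Viewer>
    #include <osgViewer/api/Cocoa/GraphicsWindowCocoa>

    // Enable multi-touch on every Cocoa window of a running viewer.
    void enableMultiTouch(osgViewer::Viewer& viewer)
    {
        osgViewer::Viewer::Windows windows;
        viewer.getWindows(windows);
        for (osgViewer::Viewer::Windows::iterator itr = windows.begin(); itr != windows.end(); ++itr)
        {
            osgViewer::GraphicsWindowCocoa* win = dynamic_cast<osgViewer::GraphicsWindowCocoa*>(*itr);
            if (win) win->setMultiTouchEnabled(true);   // assumed name of the new switch
        }
    }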
After switching multi-touch support on, all mouse events from the trackpad are ignored; otherwise you would get multiple events for the same pointer, which is very confusing (the trackpad reports absolute movement, whereas as a mouse it reports relative movement).
I think this is not a problem, as multi-touch input is a completely different beast than a mouse, so you'll have to write your own event handlers anyway.
While coding this stuff, I asked myself whether we should refactor GUIEventAdapter/EventQueue and assign a specific event-type for touch-input instead of using PUSH/DRAG/RELEASE. This would make it clearer how to use the code, but would break the mouse-emulation for the first touch-point and, with it, all existing manipulators. What do you think? I am happy to code the proposed changes.
Additionally I created a small (and ugly) example osgmultitouch which
makes use of the osgGA::MultiTouchTrackballManipulator, shows all
touch-points on a HUD and demonstrates how to get the touchpoints from
an osgGA::GUIEventAdapter.
There's even a small example video here: http://vimeo.com/31611842"
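For reference, a minimal sketch of reading those touchpoints in an event handler, assuming the TouchData accessors (isMultiTouchEvent(), getTouchData(), getNumTouchPoints(), get()) match the committed GUIEventAdapter API; install it with viewer.addEventHandler(new TouchDumper()).

    #include <osgGA/GUIEventHandler>
    #include <iostream>

    // Dump all touch-points carried by a multi-touch event.
    class TouchDumper : public osgGA::GUIEventHandler
    {
    public:
        virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter&)
        {
            if (!ea.isMultiTouchEvent()) return false;      // plain mouse/keyboard event

            osgGA::GUIEventAdapter::TouchData* touches = ea.getTouchData();
            for (unsigned int i = 0; i < touches->getNumTouchPoints(); ++i)
            {
                osgGA::GUIEventAdapter::TouchData::TouchPoint tp = touches->get(i);
                std::cout << "touch " << tp.id << " phase " << tp.phase
                          << " at " << tp.x << ", " << tp.y << std::endl;
            }
            return false;                                   // don't consume the event
        }
    };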
Another problem: the osgkeyboard example does not work (keys are not highlighted) if the user has two keyboard layouts (a native one and English) and the current layout is the native one.
Let me try to explain my changes: we need something that identifies a key independently of modifier keys and keyboard layout -> this is the UnmodifiedKey.
I think OSG must have its own UnmodifiedKeys table; the code must behave the same on different platforms, and an UnmodifiedKeys table guarantees that.
Mikhail Izmestev helped me. He implemented VirtualKey changes in GraphicsWindowX11"
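A minimal sketch of how a handler can use this, assuming the unmodified key is exposed as GUIEventAdapter::getUnmodifiedKey(): the handler below reacts to the physical 'A' key regardless of the active layout or any modifiers.

    #include <osgGA/GUIEventHandler>

    // React to the physical 'A' key independently of keyboard layout and modifiers.
    class LayoutIndependentHandler : public osgGA::GUIEventHandler
    {
    public:
        virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter&)
        {
            if (ea.getEventType() == osgGA::GUIEventAdapter::KEYDOWN &&
                ea.getUnmodifiedKey() == osgGA::GUIEventAdapter::KEY_A)
            {
                // handle the key press here
                return true;
            }
            return false;
        }
    };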
osgGA. My approach is to bundle all touchpoints into one custom data structure which is attached to a GUIEventAdapter.
The current approach simulates a moving mouse for the first touch-point,
so basic manipulators do work, sort of.
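As a hedged sketch of the producer side, assuming GUIEventAdapter::addTouchPoint() and the TOUCH_BEGAN phase constant match the committed API, a windowing backend would bundle all current touch-points into a single event before queueing it:

    #include <osg/ref_ptr>
    #include <osgGA/GUIEventAdapter>

    // Bundle two touch-points into one event, roughly as a windowing backend would.
    osg::ref_ptr<osgGA::GUIEventAdapter> makeTouchEvent(float x0, float y0, float x1, float y1)
    {
        osg::ref_ptr<osgGA::GUIEventAdapter> event = new osgGA::GUIEventAdapter;
        event->setEventType(osgGA::GUIEventAdapter::PUSH);   // first touch still emulates a mouse press

        event->addTouchPoint(0, osgGA::GUIEventAdapter::TOUCH_BEGAN, x0, y0);   // assumed signature
        event->addTouchPoint(1, osgGA::GUIEventAdapter::TOUCH_BEGAN, x1, y1);
        return event;
    }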
I created a MultiTouchTrackballManipulator class: one touch-point rotates the view, two touch-points pan and zoom the view, as known from the iPhone and other similar multi-touch devices. A double-tap (similar to a double-click) resets the manipulator to its home position.
The multi-touch trackball implementation is not the best; see it as a starting point. (There's a demo video at http://vimeo.com/15017377 )"
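The manipulator is installed like any other camera manipulator; a minimal usage sketch (model name arbitrary, window setup left to the viewer):

    #include <osgDB/ReadFile>
    #include <osgGA/MultiTouchTrackballManipulator>
    #include <osgViewer/Viewer>

    int main(int, char**)
    {
        osgViewer::Viewer viewer;
        viewer.setSceneData(osgDB::readNodeFile("cow.osg"));                     // any model will do
        viewer.setCameraManipulator(new osgGA::MultiTouchTrackballManipulator);  // gestures as described above
        return viewer.run();
    }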
"attached you'll find some modifications to Producer, osgGA and
osgProducer to enable Mac OS X support for
+ scrollwheels,
+ Mighty Mouse scrollballs
+ new tracking-pads with scroll feature
+ tablet-support (pressure, proximity and pointer type; tested with Wacom only)
I think there was a bug in the Windows implementation of scroll-wheel support (wrong order of the ScrollingMotion enum, a casting problem), which is fixed now.
The scroll-wheel code is a bit clunky across platforms: some devices on OS X can report an absolute delta in pixel coordinates, not only the direction, so for now there is scrollingMotion (which describes the direction) as well as scrolldeltax and scrolldeltay. I decided to keep the scrollingMotion mechanism so as not to break old code relying on it."
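A minimal sketch of reading both forms of scroll data in a handler; the getters getScrollingMotion(), getScrollingDeltaX() and getScrollingDeltaY() follow the names described above and should be checked against the committed GUIEventAdapter header.

    #include <osgGA/GUIEventHandler>
    #include <iostream>

    // Print the scroll direction and, where available, the absolute deltas.
    class ScrollPrinter : public osgGA::GUIEventHandler
    {
    public:
        virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter&)
        {
            if (ea.getEventType() != osgGA::GUIEventAdapter::SCROLL) return false;

            switch (ea.getScrollingMotion())                // direction, available on all platforms
            {
                case osgGA::GUIEventAdapter::SCROLL_UP:   std::cout << "up";    break;
                case osgGA::GUIEventAdapter::SCROLL_DOWN: std::cout << "down";  break;
                default:                                  std::cout << "other"; break;
            }

            // absolute per-event deltas, reported by some OS X devices (otherwise 0)
            std::cout << "  delta " << ea.getScrollingDeltaX()
                      << ", " << ea.getScrollingDeltaY() << std::endl;
            return false;
        }
    };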