/* From Stephan Huber: "Attached you'll find a first version of multi-touch support for OS X
 * (>= 10.6), which will forward all multi-touch events from a trackpad to the corresponding
 * osgGA event structures.
 *
 * The support is switched off by default, but you can enable multi-touch support via a new flag
 * for GraphicsWindowCocoa::WindowData or directly via the GraphicsWindowCocoa class.
 *
 * After switching multi-touch support on, all mouse events from the trackpad are ignored;
 * otherwise you would get multiple events for the same pointer, which is very confusing (the
 * trackpad reports absolute movement, whereas a mouse reports relative movement). I think this
 * is not a problem, as multi-touch input is a completely different beast than a mouse, so you'll
 * have to code your own event handlers anyway.
 *
 * While coding this I asked myself whether we should refactor GUIEventAdapter/EventQueue and
 * assign a specific event type to touch input instead of using PUSH/DRAG/RELEASE. That would
 * make it clearer how to use the code, but it would break the mouse emulation for the first
 * touch-point, and with that all existing manipulators. What do you think? I am happy to code
 * the proposed changes.
 *
 * Additionally I created a small (and ugly) example, osgmultitouch, which makes use of the
 * osgGA::MultiTouchTrackballManipulator, shows all touch-points on a HUD and demonstrates how to
 * get the touch-points from an osgGA::GUIEventAdapter.
 *
 * There's even a small example video here: http://vimeo.com/31611842"
 */

/* OpenSceneGraph example, osgmultitouch.
*
*  Permission is hereby granted, free of charge, to any person obtaining a copy
*  of this software and associated documentation files (the "Software"), to deal
*  in the Software without restriction, including without limitation the rights
*  to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
*  copies of the Software, and to permit persons to whom the Software is
*  furnished to do so, subject to the following conditions:
*
*  THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
*  IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
*  FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
*  AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
*  LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
*  OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
*  THE SOFTWARE.
*/

// needed for the std::cout logging and the std::ostringstream labels used below
#include <iostream>
#include <sstream>

#include <osgUtil/Optimizer>
#include <osgDB/ReadFile>

#include <osgViewer/Viewer>
#include <osgViewer/CompositeViewer>

#include <osgGA/TrackballManipulator>

#include <osg/Material>
#include <osg/Geode>
#include <osg/BlendFunc>
#include <osg/Depth>
#include <osg/PolygonOffset>
#include <osg/MatrixTransform>
#include <osg/Camera>
#include <osg/RenderInfo>

#include <osgDB/WriteFile>

#include <osgText/Text>
#include <osgGA/MultiTouchTrackballManipulator>
#include <osg/ShapeDrawable>

#ifdef __APPLE__
#include <osgViewer/api/Cocoa/GraphicsWindowCocoa>
#endif

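// The multi-touch support described in the header comment is switched off by default. Below is a
// minimal sketch of how a viewer might enable it (illustrative only: setMultiTouchEnabled() is
// assumed to be the setter exposed by GraphicsWindowCocoa; alternatively, the new flag mentioned
// for GraphicsWindowCocoa::WindowData can be supplied when the window is created):
//
//     osgViewer::Viewer viewer;
//     viewer.setCameraManipulator(new osgGA::MultiTouchTrackballManipulator());
//     viewer.realize();
// #ifdef __APPLE__
//     osgViewer::Viewer::Windows windows;
//     viewer.getWindows(windows);
//     for(osgViewer::Viewer::Windows::iterator itr = windows.begin(); itr != windows.end(); ++itr)
//     {
//         osgViewer::GraphicsWindowCocoa* win = dynamic_cast<osgViewer::GraphicsWindowCocoa*>(*itr);
//         if (win) win->setMultiTouchEnabled(true);
//     }
// #endif
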
osg::Camera* createHUD(unsigned int w, unsigned int h)
{
    // create a camera to set up the projection and model view matrices, and the subgraph to draw in the HUD
    osg::Camera* camera = new osg::Camera;

    // set the projection matrix
    camera->setProjectionMatrix(osg::Matrix::ortho2D(0,w,0,h));

    // set the view matrix
    camera->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
    camera->setViewMatrix(osg::Matrix::identity());

    // only clear the depth buffer
    camera->setClearMask(GL_DEPTH_BUFFER_BIT);

    // draw subgraph after main camera view.
    camera->setRenderOrder(osg::Camera::POST_RENDER);

    // we don't want the camera to grab event focus from the viewer's main camera(s).
    camera->setAllowEventFocus(false);

    // add to this camera a subgraph to render
    {
        osg::Geode* geode = new osg::Geode();

        std::string timesFont("fonts/arial.ttf");

        // turn lighting off for the text and disable depth test to ensure it's always on top.
        osg::StateSet* stateset = geode->getOrCreateStateSet();
        stateset->setMode(GL_LIGHTING,osg::StateAttribute::OFF);

        osg::Vec3 position(50.0f,h-50,0.0f);

        {
            osgText::Text* text = new osgText::Text;
            geode->addDrawable( text );

            text->setFont(timesFont);
            text->setPosition(position);
            text->setText("A simple multi-touch-example\n1 touch = rotate, \n2 touches = drag + scale, \n3 touches = home");
        }

        camera->addChild(geode);
    }

    return camera;
}
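
// Event handler that draws a box + label for every active touch-point reported in the incoming
// multi-touch events and logs the raw event stream to std::cout.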
class TestMultiTouchEventHandler : public osgGA::GUIEventHandler {
public:

    TestMultiTouchEventHandler(osg::Group* parent_group, float w, float h)
        : osgGA::GUIEventHandler(),
          _cleanupOnNextFrame(false),
          _w(w),
          _h(h)
    {
        createTouchRepresentations(parent_group, 10);
    }

private:

    void createTouchRepresentations(osg::Group* parent_group, unsigned int num_objects)
    {
        // create some geometry which is shown for every touch-point
        for(unsigned int i = 0; i != num_objects; ++i)
        {
            std::ostringstream ss;

            osg::Geode* geode = new osg::Geode();

            osg::ShapeDrawable* drawable = new osg::ShapeDrawable(new osg::Box(osg::Vec3(0,0,0), 100));
            drawable->setColor(osg::Vec4(0.5, 0.5, 0.5, 1));
            geode->addDrawable(drawable);

            ss << "Touch " << i;

            osgText::Text* text = new osgText::Text;
            geode->addDrawable( text );
            drawable->setDataVariance(osg::Object::DYNAMIC);
            _drawables.push_back(drawable);

            text->setFont("fonts/arial.ttf");
            text->setPosition(osg::Vec3(110,0,0));
            text->setText(ss.str());
            _texts.push_back(text);
            text->setDataVariance(osg::Object::DYNAMIC);

            osg::MatrixTransform* mat = new osg::MatrixTransform();
            mat->addChild(geode);
            mat->setNodeMask(0x0); // start hidden; made visible while the corresponding touch-point is active

            _mats.push_back(mat);

            parent_group->addChild(mat);
        }

        parent_group->getOrCreateStateSet()->setMode(GL_LIGHTING, osg::StateAttribute::OFF);
    }

    virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter& aa, osg::Object*, osg::NodeVisitor*)
    {
        if (ea.getEventType() != osgGA::GUIEventAdapter::FRAME) {
            std::cout << ea.getTime() << ": ";
            switch(ea.getEventType()) {
                case osgGA::GUIEventAdapter::PUSH:    std::cout << "PUSH"; break;
                case osgGA::GUIEventAdapter::DRAG:    std::cout << "DRAG"; break;
                case osgGA::GUIEventAdapter::MOVE:    std::cout << "MOVE"; break;
                case osgGA::GUIEventAdapter::RELEASE: std::cout << "RELEASE"; break;
                default: std::cout << ea.getEventType();
            }
            std::cout << std::endl;
        }

        switch(ea.getEventType())
        {
            case osgGA::GUIEventAdapter::FRAME:
                if (_cleanupOnNextFrame) {
                    cleanup(0);
                    _cleanupOnNextFrame = false;
                }
                break;

            case osgGA::GUIEventAdapter::PUSH:
            case osgGA::GUIEventAdapter::DRAG:
            case osgGA::GUIEventAdapter::RELEASE:
            {
                // is this a multi-touch event?
                if (!ea.isMultiTouchEvent())
                    return false;

                unsigned int j(0);

                // iterate over all touch-points and update the geometry
                unsigned num_touch_ended(0);

                for(osgGA::GUIEventAdapter::TouchData::iterator i = ea.getTouchData()->begin(); i != ea.getTouchData()->end(); ++i, ++j)
                {
                    const osgGA::GUIEventAdapter::TouchData::TouchPoint& tp = (*i);

                    float x = ea.getTouchPointNormalizedX(j);
                    float y = ea.getTouchPointNormalizedY(j);

                    // std::cout << j << ": " << tp.x << "/" << tp.y <<" "<< x << " " << y << " " << _w << " " << _h << std::endl;

                    // map the normalized touch coordinates into HUD pixel coordinates
                    _mats[j]->setMatrix(osg::Matrix::translate((1+x) * 0.5 * _w, (1+y) * 0.5 * _h, 0));
                    _mats[j]->setNodeMask(0xffff);

                    std::ostringstream ss;
                    ss << "Touch " << tp.id;
                    _texts[j]->setText(ss.str());

                    switch (tp.phase)
                    {
                        case osgGA::GUIEventAdapter::TOUCH_BEGAN:
                            _drawables[j]->setColor(osg::Vec4(0,1,0,1));
                            break;

                        case osgGA::GUIEventAdapter::TOUCH_MOVED:
                            _drawables[j]->setColor(osg::Vec4(1,1,1,1));
                            break;

                        case osgGA::GUIEventAdapter::TOUCH_ENDED:
                            _drawables[j]->setColor(osg::Vec4(1,0,0,1));
                            ++num_touch_ended;
                            break;

                        case osgGA::GUIEventAdapter::TOUCH_STATIONERY:
                            _drawables[j]->setColor(osg::Vec4(0.8,0.8,0.8,1));
                            break;

                        default:
                            break;
                    }
                }
|
2015-10-22 22:14:53 +08:00
|
|
|
|
From Stephan Huber, "attached you'll find a first version of multi-touch-support for OS X (>=
10.6), which will forward all multi-touch events from a trackpad to the
corresponding osgGA-event-structures.
The support is switched off per default, but you can enable multi-touch
support via a new flag for GraphicsWindowCocoa::WindowData or directly
via the GraphicsWindowCocoa-class.
After switching multi-touch-support on, all mouse-events from the
trackpad get ignored, otherwise you'll have multiple events for the same
pointer which is very confusing (as the trackpad reports absolute
movement, and as a mouse relative movement).
I think this is not a problem, as multi-touch-input is a completely
different beast as a mouse, so you'll have to code your own
event-handlers anyway.
While coding this stuff, I asked myself if we should refactor
GUIEventAdapter/EventQueue and assign a specific event-type for
touch-input instead of using PUSH/DRAG/RELEASE. This will make it
clearer how to use the code, but will break the mouse-emulation for the
first touch-point and with that all existing manipulators. What do you
think? I am happy to code the proposed changes.
Additionally I created a small (and ugly) example osgmultitouch which
makes use of the osgGA::MultiTouchTrackballManipulator, shows all
touch-points on a HUD and demonstrates how to get the touchpoints from
an osgGA::GUIEventAdapter.
There's even a small example video here: http://vimeo.com/31611842"
2012-02-03 22:25:08 +08:00
|
|
|
// hide unused geometry
|
|
|
|
cleanup(j);
                    // check if all touches ended
                    if ((ea.getTouchData()->getNumTouchPoints() > 0) && (ea.getTouchData()->getNumTouchPoints() == num_touch_ended))
                    {
                        _cleanupOnNextFrame = true;
                    }
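                    // (num_touch_ended is presumably incremented in the touch loop above for every
                    // point whose phase is TOUCH_ENDED; once all points have ended, _cleanupOnNextFrame
                    // defers hiding the markers until the next FRAME event.)
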
                    // reposition mouse-pointer
                    aa.requestWarpPointer((ea.getWindowX() + ea.getWindowWidth()) / 2.0, (ea.getWindowY() + ea.getWindowHeight()) / 2.0);
                }
                break;

            default:
                break;
        }

        return false;
    }

    // hide all touch markers from index j to the end of the vector
    void cleanup(unsigned int j)
    {
        for(unsigned k = j; k < _mats.size(); ++k) {
            _mats[k]->setNodeMask(0x0);
        }
    }

    // one marker (shape, transform and text label) per displayable touch point
    std::vector<osg::ShapeDrawable*> _drawables;
    std::vector<osg::MatrixTransform*> _mats;
    std::vector<osgText::Text*> _texts;
    bool _cleanupOnNextFrame;

    float _w, _h;
};

int main( int argc, char **argv )
{
    // use an ArgumentParser object to manage the program arguments.
    osg::ArgumentParser arguments(&argc,argv);

    unsigned int helpType = 0;
    if ((helpType = arguments.readHelpType()))
    {
        arguments.getApplicationUsage()->write(std::cout, helpType);
        return 1;
    }

    // report any errors if they have occurred when parsing the program arguments.
    if (arguments.errors())
    {
        arguments.writeErrorMessages(std::cout);
        return 1;
    }

    // read the scene from the list of files specified as command-line arguments.
    osg::ref_ptr<osg::Node> scene = osgDB::readRefNodeFiles(arguments);

    // if nothing was loaded, assume no arguments were passed in and try the default model instead.
    if (!scene) scene = osgDB::readRefNodeFile("dumptruck.osgt");

    // still nothing? fall back to a simple grey box so the example runs without any data files.
    if (!scene)
    {
        osg::Geode* geode = new osg::Geode();
        osg::ShapeDrawable* drawable = new osg::ShapeDrawable(new osg::Box(osg::Vec3(0,0,0), 100));
        drawable->setColor(osg::Vec4(0.5, 0.5, 0.5,1));
        geode->addDrawable(drawable);
        scene = geode;
    }

    // construct the viewer.
    osgViewer::Viewer viewer(arguments);

    // open any input devices requested on the command line
    std::string device;
    while(arguments.read("--device", device))
    {
        osg::ref_ptr<osgGA::Device> dev = osgDB::readRefFile<osgGA::Device>(device);
        if (dev.valid())
        {
            viewer.addDevice(dev);
        }
    }
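    // Usage sketch (the device string below is purely illustrative; actual strings are
    // plugin-specific):
    //   osgmultitouch --device <osgGA-device-string>
    // Any osgGA::Device that an osgDB plugin can construct from such a string is added to
    // the viewer, and its events are merged into the viewer's normal event queue.
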
    osg::ref_ptr<osg::Group> group = new osg::Group;

    // add the loaded scene to the root group (the HUD camera is added further below).
    if (scene) group->addChild(scene);

    // use the multi-touch aware trackball manipulator
    viewer.setCameraManipulator(new osgGA::MultiTouchTrackballManipulator());

    viewer.realize();
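    // (realize() creates the actual graphics windows, so the graphics context queried below
    // exists from this point on)
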
    osg::GraphicsContext* gc = viewer.getCamera()->getGraphicsContext();

#ifdef __APPLE__
    // as multi-touch is disabled by default, enable it now on the Cocoa graphics window
    osgViewer::GraphicsWindowCocoa* win = dynamic_cast<osgViewer::GraphicsWindowCocoa*>(gc);
    if (win) win->setMultiTouchEnabled(true);
#endif
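    // (multi-touch could presumably also be requested at window-creation time through
    // GraphicsWindowCocoa::WindowData attached to the context traits; the exact flag is not
    // shown here, so this example enables it on the realized window instead)
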
    std::cout << "creating hud with " << gc->getTraits()->width << "x" << gc->getTraits()->height << std::endl;

    osg::Camera* hud_camera = createHUD(gc->getTraits()->width, gc->getTraits()->height);

    viewer.addEventHandler(new TestMultiTouchEventHandler(hud_camera, gc->getTraits()->width, gc->getTraits()->height));
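    // (the handler receives the HUD camera plus the window dimensions so it can place and
    // label a marker for each active touch point in HUD coordinates)
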
    group->addChild(hud_camera);

    // set the scene to render
    viewer.setSceneData(group.get());

    return viewer.run();
}