In osg::isGLExtensionOrVersionSupported in src/osg/GLExtensions.cpp, when
using indirect X11 rendering,
glGetIntegerv( GL_NUM_EXTENSIONS, &numExt );
leaves numExt uninitialized, causing the subsequent glGetStringi call to
return NULL when the extension index isn't present. Passing that NULL to
std::string() then crashes. This is with the following NVIDIA driver:
OpenGL version string: 3.3.0 NVIDIA 256.35
I went ahead and initialized some of the other variables passed to
glGetIntegerv in other files as well. I don't know for sure which
queries can fail, so I don't know which initializations are strictly required.
"
examples\osgautocapture\osgautocapture.cpp
-fixed a bug where the --active command line option did not render
-added a --pbuffer command line option (see the sketch after these notes)
-changed a very confusing #ifdef 0
-added OSG_GLES GL_RGB readPixels support if available (UNTESTED)"
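For context, a rough sketch of what a --pbuffer option typically sets up in
OSG, using osg::GraphicsContext::Traits with an illustrative capture size;
this is not the exact code from the osgautocapture example:

    #include <osg/GraphicsContext>

    osg::ref_ptr<osg::GraphicsContext::Traits> traits = new osg::GraphicsContext::Traits;
    traits->x = 0;
    traits->y = 0;
    traits->width = 1024;      // illustrative capture size
    traits->height = 768;
    traits->pbuffer = true;    // request an off-screen pbuffer surface instead of a window
    osg::ref_ptr<osg::GraphicsContext> pbuffer =
        osg::GraphicsContext::createGraphicsContext( traits.get() );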