I fixed some bugs and did some more testing with both video plugins. I integrated CoreVideo with osgPresentation: ImageStream has a new virtual method called createSuitableTexture, which returns NULL in the default implementation; specialized implementations such as the QTKit plugin return a CoreVideo texture. I refactored the code in SlideShowConstructor::createTexturedQuad to use the texture returned from ImageStream::createSuitableTexture.
I did not use osgDB::readObjectFile to get the texture object, as that would have required refactoring a lot of image-related code in SlideShowConstructor to work with textures. My changes are minimal and should not break existing code.
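To sketch the intent: a caller such as SlideShowConstructor::createTexturedQuad can try the new hook first and fall back to an ordinary texture if the stream does not provide one. The helper name, the fallback details and the assumption that createSuitableTexture returns an osg::Texture* are illustrative, not the actual patch.

#include <osg/ImageStream>
#include <osg/Texture2D>

// Hypothetical helper: prefer the stream's own texture, otherwise wrap the
// stream in a plain 2D texture as before.
osg::ref_ptr<osg::Texture> getOrCreateTextureForStream(osg::ImageStream* imageStream)
{
    osg::ref_ptr<osg::Texture> texture = imageStream->createSuitableTexture();
    if (!texture)
    {
        osg::Texture2D* tex2D = new osg::Texture2D(imageStream);
        tex2D->setResizeNonPowerOfTwoHint(false);
        tex2D->setFilter(osg::Texture::MIN_FILTER, osg::Texture::LINEAR);
        texture = tex2D;
    }
    return texture;
}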
There's one minor issue with CoreVideo in general: because the implementation is asynchronous, there may be no texture available yet when the first frame of the video is shown. I am a bit unsure how to tackle this problem; any input is appreciated.
Back to the AVFoundation plugin: the current implementation does not support CoreVideo in the way the QTKit plugin does. There is no way to get decoded frames from AVFoundation that are already stored on the GPU, which is kind of sad. I added some CoreVideo support to transfer decoded frames back to the GPU, but in my tests the performance was worse than the normal glTexSubImage approach, so I disabled CoreVideo for AVFoundation. You can still request a CoreVideoTexture via readObjectFile, though.
"
I don’t have access to an OSX or Linux dev machine to make the changes required to the quick time plugin. This plugin will just default to returning 0."
"enable thread locking in libavcodec
This is required for a multithreaded application using ffmpeg from
another thread."
"Prevent the audio from videos from hanging on exit if they are paused.
The video decoder already has similar logic."
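A sketch of the kind of logic meant here, with assumed member and function names (m_exit, m_paused, decodeNextPacket); the point is that a paused decode loop must keep checking the exit flag so shutdown is not blocked.

#include <OpenThreads/Thread>

// Assumed shape of the audio decode thread's main loop (illustration only).
void FFmpegDecoderAudio::run()
{
    while (!m_exit)
    {
        if (m_paused)
        {
            // stay idle while paused, but keep polling m_exit so a paused
            // stream still shuts down cleanly on exit
            OpenThreads::Thread::microSleep(10000);
            continue;
        }
        decodeNextPacket();   // hypothetical: pull and decode one audio packet
    }
}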
"Add a way to retrieve the creation time for MPEG-4 files."
"fmpeg, improve wait for close logic
Both audio and video destructors have been succesfully using the logic,
if (isRunning())
{
    m_exit = true;
    join();
}
since it was introduced,
but the close routines are using,
m_exit = true;
if (isRunning() && waitForThreadToExit)
{
    while (isRunning()) { OpenThreads::Thread::YieldCurrentThread(); }
}
which not only does an unnecessary busy wait, but also doesn't guarantee that the other thread has terminated, only that it has progressed far enough for OpenThreads to mark it as no longer running. Like the destructor, set m_exit after checking isRunning(), to avoid the race where the thread exits between setting m_exit and checking isRunning(), so the check returns false and join() is never reached.
Now that FFmpeg*close is fixed, call it from the destructor as well so that this code lives in only one location."
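A sketch of what the consolidated logic could look like, using the video decoder as the example (the exact signatures are assumptions based on the message above):

// close(): check first, then request exit and join, so the busy wait goes
// away and the thread is guaranteed to have terminated before returning.
void FFmpegDecoderVideo::close(bool waitForThreadToExit)
{
    if (isRunning())
    {
        m_exit = true;
        if (waitForThreadToExit)
            join();
    }
}

// The destructor simply reuses close() so the shutdown code lives in one place.
FFmpegDecoderVideo::~FFmpegDecoderVideo()
{
    close(true);
}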
"Implemented the virtual methods in QuicktimeImageStream (getLength, getReferenceTime, setTimeMultiplier) so that each returns a valid value; getLength() now returns the length of the stream."