path: root/src/jogl/native
Commit message | Author | Age | Files | Lines
* Bug 1239: Fix comment in c-code 'how to set preferred pixel_format' (Sven Gothel, 2015-10-05; 1 file, -1/+1)
* Bug 1239: Support OSX input via 'avfoundation'; Use remaining camera ID (index) as filename for OSX (Sven Gothel, 2015-10-05; 1 file, -0/+5)
* Bug 1239: Add support for UYVY422 (swizzled YUYV422) (Sven Gothel, 2015-10-05; 1 file, -2/+4)
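  A minimal C sketch of the byte-order difference behind this swizzle; both 4:2:2 packings store two horizontally adjacent pixels in 4 bytes, so only the component order changes. The helper below is illustrative, not JOGL code:

    /* YUYV422 byte order: Y0 U Y1 V  |  UYVY422 byte order: U Y0 V Y1 */
    typedef struct { unsigned char b0, b1, b2, b3; } Packed422; /* two pixels */

    static void unpack422(const Packed422 *p, int isUYVY,
                          unsigned char *y0, unsigned char *u,
                          unsigned char *y1, unsigned char *v) {
        if (isUYVY) { *u  = p->b0; *y0 = p->b1; *v  = p->b2; *y1 = p->b3; }
        else        { *y0 = p->b0; *u  = p->b1; *y1 = p->b2; *v  = p->b3; }
    }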
* Bug 1207 - GLDebugMessageHandler: Support GL_KHR_debug for Desktop and ES profile (Sven Gothel, 2015-08-30; 1 file, -2/+5)
    GL_KHR_debug <https://www.opengl.org/registry/specs/KHR/debug.txt> shall be favored over
    - GL_ARB_debug_output
    - GL_AMD_debug_output
    Allow GL_KHR_debug for GL2GL3 and GL2ES2 profiles, i.e. including the ES profiles GLES2 and GLES3.
    GL_ARB_debug_output and GL_AMD_debug_output are only allowed for desktop GL2GL3 profiles.
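  A hedged C sketch of the preference order described above; the helper name, the extension-string check and the profile flag are assumptions for illustration, not the actual GLDebugMessageHandler code:

    #include <string.h>

    typedef enum { DEBUG_NONE, DEBUG_KHR, DEBUG_ARB, DEBUG_AMD } DebugExt;

    /* 'extensions' is the space-separated GL extension string of the current context */
    static DebugExt selectDebugExtension(const char *extensions, int isDesktopGL2GL3) {
        if (strstr(extensions, "GL_KHR_debug") != NULL)
            return DEBUG_KHR;   /* allowed on desktop and ES profiles */
        if (isDesktopGL2GL3 && strstr(extensions, "GL_ARB_debug_output") != NULL)
            return DEBUG_ARB;   /* desktop GL2GL3 only */
        if (isDesktopGL2GL3 && strstr(extensions, "GL_AMD_debug_output") != NULL)
            return DEBUG_AMD;   /* desktop GL2GL3 only */
        return DEBUG_NONE;
    }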
* openmax: fix compile errors (clang) (Sven Gothel, 2015-07-16; 2 files, -1/+2)
* Bug 1135 - Cleanup: Fix native code Warning (Sven Gothel, 2015-03-06; 2 files, -6/+7)
* FFMPEGMediaPlayer: Add support for libav-11 and ffmpeg 2.[4-x] (Latest release, used in Debian 8, etc.) (Sven Gothel, 2015-02-05; 1 file, -0/+33)
* Fix FFMPEGMediaPlayer: static init block issue, libavresample debian8 packaging (Sven Gothel, 2015-02-05; 1 file, -3/+4)
    - static init block issue: commit 06a05d30fc026b21f59310986ea9eb7f3ff30d54 used a static final field initialized after the static {} block, which was still null if called -> moved above the static {} block
    - libavresample debian8 packaging: Debian 8 packages a libav10 combination w/ libavresample version 2, which actually belongs to libav11; libav10 uses libavresample version 1. Allow libavresample and libswresample to be selectively skipped if the versions mismatch.
* Bug 1096 - Add missing EGLContext.c native code (Sven Gothel, 2015-01-23; 1 file, -0/+39)
    As required for commit d0676451343e826e49d9c5732320f080d4c11c8d
* Bug 1087: Set default framebuffer for OSX DummyDrawable, hence enforce NSView realization for DummyDrawable (Sven Gothel, 2014-10-08; 1 file, -1/+1)
* Fix Bug 1019 - Remedy of Bug 691 causes 'access/modify after free' and crashes the app (Sven Gothel, 2014-06-12; 1 file, -27/+2)
    The 'magic' MyNSOpenGLContext::dealloc (MacOSXWindowSystemInterface-calayer.m) of force-destroying the underlying CGLContextObj of its associated NSOpenGLContext, introduced as a remedy of Bug 691, is plain wrong.
    It was added in commit f6e6fab2a7ddfb5c9b614cb072c27ff697629161 to mitigate the observed behavior of delayed GL context destruction when creating/destroying contexts multiple times, as exposed in unit test TestGLCanvasAddRemove01SwingAWT.
    While this 'hack' worked for some reason on some OSX versions, it caused an 'access/modify after free' issue exposed under some circumstances and crashes the application.
    The actual culprit of the delayed GL context destruction is different: the off-thread CALayer detachment, and hence the final destruction issued on the main-thread, is _not_ issued immediately due to some reference holding by NSApp. Issuing an empty event on the NSApp (thread) wakes up the thread and releases the claimed resources.
    This was found after realizing that the GL contexts are released if the mouse is being moved (duh!). This issue is also known when triggering stop on the NSApp (NEWT MainThread); the same remedy has been implemented there for a long time.
* Bug 1011 / Bug 1012: GLMediaPlayer Audio/Video stuttering w/ OSX and OpenAL/JOAL (works using openal-soft default on all platforms now) (Sven Gothel, 2014-06-11; 1 file, -1/+5)
* Bug 741 HiDPI: Add ScalableSurface interface to get/set pixelScale w/ full OSX impl. (Sven Gothel, 2014-06-08; 1 file, -4/+8)
    Add ScalableSurface interface
    - To set pixelScale before and after realization
    - To get pixelScale
    - Implemented on:
      - NEWT Window
        - Generic impl. in WindowImpl
        - OSX WindowDriver impl.
        - Also propagates pixelScale to parent JAWTWindow if offscreen (NewtCanvasAWT)
      - AWT WindowDriver impl.
      - JAWTWindow / OSXCalayer
      - AWT GLCanvas
      - AWT GLJPanel
      - NEWTCanvasAWT: Propagates NEWT Window's pixelScale to underlying JAWTWindow
      - WrappedSurface for pixelScale propagation using offscreen drawables, i.e. GLJPanel
      - Generic helper in SurfaceScaleUtils (nativewindow package)
    - Fully implemented on OSX
    - Capable of switching pixelScale before realization, i.e. native-creation, as well as on-the-fly.
    - Impl. uses int[2] for pixelScale to support non-uniform scale.
    Test cases:
    - com.jogamp.opengl.test.junit.jogl.demos.es2.newt.TestGearsES2NEWT
    - com.jogamp.opengl.test.junit.jogl.demos.es2.awt.TestGearsES2AWT
    - com.jogamp.opengl.test.junit.jogl.demos.es2.awt.TestGearsES2GLJPanelAWT
    - com.jogamp.opengl.test.junit.jogl.demos.es2.newt.TestGearsES2NewtCanvasAWT
    - Press 'x' to toggle HiDPI
    - Commandline '-pixelScale <value>'
    - Added basic auto unit test (setting pre-realization)
* Bug 742 HiDPI: [Core API Change] Distinguish window-units and pixel-units: Refine commit fb57c652fee6be133990cd7afbbd2fdfc084afaa (Sven Gothel, 2014-05-23; 1 file, -1/+1)
    - NEWT Screen, Monitor, MonitorMode, ..
      - All units are in pixel units, not window units!
      - On OSX HiDPI, we report the current scaled monitor resolution instead of the native pixel sizes. Need to filter those out, i.e. report only native unscaled resolutions, since our MonitorMode analogy is per MonitorDevice and not per window!
    - Fix usage (one by one) of Screen and Monitor viewports
* Bug 742 HiDPI: [Core API Change] Distinguish window-units and pixel-units; Add HiDPI for AWT GLCanvas w/ OSX CALayer (Sven Gothel, 2014-05-21; 2 files, -19/+45)
    Core API Change: To support HiDPI thoroughly in JOGL (NativeWindow, JOGL, NEWT) we need to separate window- and pixel-units. NativeWindow and NativeSurface now have distinguished access methods for window units and pixel units.
    NativeWindow: Using window units
    - getWindowWidth()  * NEW Method *
    - getWindowHeight() * NEW Method *
    - getX(), getY(), ...
    NativeSurface: Using pixel units
    - getWidth()  -> getSurfaceWidth()  * RENAMED *
    - getHeight() -> getSurfaceHeight() * RENAMED *
    GLDrawable: Using pixel units
    - getWidth()  -> getSurfaceWidth()  * RENAMED, aligned w/ NativeSurface *
    - getHeight() -> getSurfaceHeight() * RENAMED, aligned w/ NativeSurface *
    The above changes also remove the API collision w/ other windowing TKs, e.g. AWT's getWidth()/getHeight() in GLCanvas, and the same method names in GLDrawable before this change.
    Now preliminarily 'working':
    - AWT GLCanvas
    - AWT GLJPanel
    Tested manually on OSX w/ and w/o HiDPI Retina:
      java com.jogamp.opengl.test.junit.jogl.demos.es2.awt.TestGearsES2AWT -manual -noanim -time 1000000
      java com.jogamp.opengl.test.junit.jogl.demos.es2.awt.TestGearsES2GLJPanelAWT -manual -noanim -time 1000000
    TODO:
    - NEWT: Change Window.setSize(..) to use pixel units?
    - OSX HiDPI support
    - Testing ..
    - API refinement
* FFMPEGMediaPlayer / FFMPEGv10Natives: Fix libav-10 and ffmpeg-2.x version validation (libavutil) (Sven Gothel, 2014-05-09; 1 file, -0/+0)
* Bug 927 - Multithreading (MT) issues libav/ffmpeg (Sven Gothel, 2014-02-22; 2 files, -65/+71)
    FFMPEG Natives:
    - Move 'mutex_avcodec_openclose' to local static and initialize at initSymbols0
    - setStream0: Add another locked mutex block around [ sp_avformat_open_input .. sp_avformat_find_stream_info ]
    This solves the issue of: [NULL @ 0x89d20c60] insufficient thread locking around avcodec_open/close()
* Bug 927: Fix minor MT issues w/ libav/ffmpeg (Sven Gothel, 2014-02-16; 2 files, -20/+74)
    Issue: [NULL @ 0x35bde60] insufficient thread locking around avcodec_open/close()
    Decorate said libav functions w/ mutex lock/release. The impl. is abstracted to use either pthread or a JNI monitor, but uses the latter to reduce dependencies (mingw64 windows).
    FFMPEGNatives is now an abstract class containing the 'static final Object mutex_avcodec_openclose'.
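  A hedged C sketch of the locking idea described above, not the actual JOGL code: serialize avcodec_open2()/avcodec_close() through a JNI monitor on a shared Java object (assumed here to be passed in as 'mutex'), avoiding a pthread dependency. Error handling is minimal for brevity:

    #include <jni.h>
    #include <libavcodec/avcodec.h>

    static int openCodecLocked(JNIEnv *env, jobject mutex,
                               AVCodecContext *ctx, const AVCodec *codec) {
        int res;
        (*env)->MonitorEnter(env, mutex);       /* lock the shared Java-side mutex object */
        res = avcodec_open2(ctx, codec, NULL);  /* libav requires external locking here */
        (*env)->MonitorExit(env, mutex);        /* release */
        return res;
    }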
* [Jogl|Nativewindow|Newt]Common: Align all *Common_GetJNIEnv()/_ReleaseJNIEnv() methods and usage / check arguments .. (Sven Gothel, 2014-01-11; 7 files, -143/+116)
    Since we still don't use inter-module native code sharing, align the JNIEnv get/release methods and their usage.
    The main beneficiary here is OSX and the GLDebugMessageHandler, both of which managed the JVM handle on their own - removed now.
    Also ensure that *Common_init(..) is called for all modules on all platforms.
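  A hedged sketch of the get/release JNIEnv pattern this commit aligns across modules; the function names and the 'shallBeDetached' out-parameter are assumptions for illustration:

    #include <jni.h>
    #include <stddef.h>

    static JavaVM *jvmHandle = NULL;  /* assumed to be set once in the module's *Common_init(..) */

    JNIEnv *Common_GetJNIEnv(int asDaemon, int *shallBeDetached) {
        JNIEnv *env = NULL;
        *shallBeDetached = 0;
        if ((*jvmHandle)->GetEnv(jvmHandle, (void **)&env, JNI_VERSION_1_6) == JNI_EDETACHED) {
            /* thread not yet attached: attach it and remember to detach on release */
            jint res = asDaemon
                ? (*jvmHandle)->AttachCurrentThreadAsDaemon(jvmHandle, (void **)&env, NULL)
                : (*jvmHandle)->AttachCurrentThread(jvmHandle, (void **)&env, NULL);
            if (res != JNI_OK) { return NULL; }
            *shallBeDetached = 1;
        }
        return env;
    }

    void Common_ReleaseJNIEnv(int shallBeDetached) {
        if (shallBeDetached) {
            (*jvmHandle)->DetachCurrentThread(jvmHandle);
        }
    }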
* Bug 918 (2/2): Determine StreamWorker usage after stream-init; Fix seek(..); Fallback for EOS detection; MovieSimple uses full GLEventListener for 'Audio Only' as well to test seek (Sven Gothel, 2013-12-11; 1 file, -5/+11)
    Determine StreamWorker usage after init
    - To support audio-only files, we need to determine whether to use StreamWorker after completion of stream-init.
    Fix seek(..)
    - FFMPeg: pos0 needs to use aPTS for audio-only
    - Clip target time to [0..duration[
    Fallback for EOS detection, in case the backend does not report proper EOS:
    - Utilize 'nullFramesCount >= MAX' -> EOS, where MAX is the number of frames for 3s play duration and where 'nullFramesCount' is increased if no valid packet is available and no decoded video or audio is in the queue.
    - Utilize pts > duration -> EOS
    MovieSimple uses the full GLEventListener for 'Audio Only' as well, to test seek
    - Matroska seek for audio-only leads to EOS .. http://video.webmfiles.org/big-buck-bunny_trailer.webm
    - MP4 audio-only seek works: http://download.blender.org/peach/bigbuckbunny_movies/BigBuckBunny_320x180.mp4
    MovieSimple/MovieCube:
    - Use audio PTS in audio-only mode to calculate the target time
    Tested:
    - A, V and A+V
    - Pause, Stop and Seek
    - GNU/Linux
* Bug 918 (1/2): Use StreamWorker in 'Audio Only' mode, since no 'getNextTexture(..)' is issued here! (Sven Gothel, 2013-12-11; 1 file, -6/+1)
    Thanks to Xerxes for analyzing this issue thoroughly.
    TODO: Implement EOS for 'Audio Only' and test seek, pause, etc.
    - Apply manual tests in MovieSimple
* JNI Code: Call DeleteLocalRef(..) manually. (Sven Gothel, 2013-11-05; 1 file, -0/+1)
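  A hedged C illustration of why one calls DeleteLocalRef(..) manually in long-running native code paths: local references created in a loop otherwise accumulate until the native method returns. Function and parameter names are placeholders, not the actual JOGL call site:

    #include <jni.h>

    static void processElements(JNIEnv *env, jobjectArray array) {
        jsize i, n = (*env)->GetArrayLength(env, array);
        for (i = 0; i < n; i++) {
            jobject elem = (*env)->GetObjectArrayElement(env, array, i);
            /* ... use 'elem' ... */
            (*env)->DeleteLocalRef(env, elem);  /* release immediately, don't wait for return */
        }
    }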
* Bug 871 - Add optional xcode.clang support for all modules (Extends Bug 837 w/ xcode's xcrun) (Sven Gothel, 2013-10-24; 1 file, -1/+1)
    - Remove abs. include path: #include </usr/include/machine/types.h> -> #include <machine/types.h>
* Bug816 OSX CALayer: Issue w/ JSplitPane within Apple (Firefox, Safari - not appletviewer) when moving the horizontal slider (vertical: ok) (Sven Gothel, 2013-10-09; 1 file, -3/+4)
    Moving the horizontal slider, if run as an applet (Firefox, Safari - not appletviewer), doesn't move the GLCanvas even though it is resized.
* Bug 816: Fix OSX CALayer 'quirks' for AWT 1.7.0_40 - See JAWTUtil JAWT_OSX_CALAYER_QUIRK_SIZE and JAWT_OSX_CALAYER_QUIRK_POSITION (Sven Gothel, 2013-09-24; 1 file, -31/+45)
    - Provide quirk bits for OSX CALayer depending on the used JVM/AWT and act accordingly.
    - TestBug816OSXCALayerPosAWT: Add resize by frame
* Complete commit 4b866d2686ab9c3fd7cf6708925b4663ad81e359: Relocate FFMPEGNatives.initIDS0() -> FFMPEGStaticNatives.initIDS0(); Clean up warnings and includes (clang); Forgot to commit new ffmpeg_static.h (Sven Gothel, 2013-09-13; 1 file, -0/+50)
* Relocate FFMPEGNatives.initIDS0() -> FFMPEGStaticNatives.initIDS0(); Clean up warnings and includes (clang). (Sven Gothel, 2013-09-11; 3 files, -98/+86)
* FFMPEGMediaPlayer: Handle use-case of having the [av|sw]resample lib, but not being compiled for it -> pass (Sven Gothel, 2013-09-01; 1 file, -2/+2)
    Scenario ffmpeg-0.10, where we are not prepared (compiled-in) for sw-resample support. Don't use it if the compiled-in version (CC) is < 0 (n/a), and allow passing at load time.
* ffmpeg_impl_template: Remove DEBUG line .. (Sven Gothel, 2013-09-01; 1 file, -1/+1)
* FFMPEG/GLMediaPlayer: Fix compiler errors w/ new MingW 4.8.1: 'strsafe.h' -> don't use tchar.h; Fix compiler warning: Add missing (intptr_t) cast. (Sven Gothel, 2013-08-31; 2 files, -4/+10)
* GLMediaPlayer enhancements: State, Camera options, detect and act on orientation change (flipped), API-doc (Sven Gothel, 2013-08-30; 2 files, -34/+47)
    - State
      - Fix state transition (initGL() error)
    - Camera options
      - Options use ';' as query separator
      - Don't use 'default' options, the driver should know
    - Detect and act on orientation change (flipped)
      - The ffmpeg impl. detects if 'flipped' changes and triggers a SIZE update event. This allows the application to react, i.e. re-init GL and use new TextureCoords.
        Test: Works well on Windows w/ the rawvideo dshow camera driver/codec.
    - API-doc
      - TexSeqEventListener/GLMediaEventListener usage / constraints (GL, ..)
      - State transition fix
* FFMPEGMediaPlayer: Handle v-flipped 'bottom-up' pictures; Refine API doc 'camera ID' (Sven Gothel, 2013-08-30; 3 files, -33/+52)
    If linesize is < 0, it is not invalid as assumed in commit eca6a5cb1e2beda84dfbafc31ed225e272f4f3fb, but vertically flipped (bottom-up). We have to adjust the data pointers, which are moved to the upper end of memory as well, and can then proceed as usual.
    TODO:
    - Update texture 'mustFlipVertically' to 'false' in this case.
    - Later:
      - Allow updating the texture size ..
      - The whole pixel-fmt/texture-lookup-shader association must scale better, i.e. extract the 'knowledge' into one class, use static shader code with uniforms instead of hard-coded values .. etc.
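  A hedged sketch of the pointer adjustment described above, not the exact JOGL code: with a negative linesize the plane pointer sits at the upper end of memory, so stepping back (height-1) rows yields the lowest address, from which the upload can proceed with the absolute stride (the result is then vertically flipped, hence the mustFlipVertically TODO). The helper name and the explicit planeHeight parameter are assumptions:

    #include <libavcodec/avcodec.h>

    /* Return the lowest memory address of a plane, for both positive and negative strides. */
    static unsigned char *planeBaseAddress(const AVFrame *frame, int plane, int planeHeight) {
        if (frame->linesize[plane] < 0) {
            return frame->data[plane] + (planeHeight - 1) * frame->linesize[plane];
        }
        return frame->data[plane];
    }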
* Enhance GLMediaPlayer: Full FFMPeg support, 'dshow' camera support on windows, 2 more pixel formats, fail-safe data handling (Sven Gothel, 2013-08-29; 6 files, -136/+638)
    - Add support for ffmpeg 2 / libav 10 -> lavc55_lavf55_lavu52_lavr01
    - Add support for ffmpeg libswresample (similar to libavresample)
    - Handle BGRA (GL type) and BGR24 (texture shader)
    - Change Camera URI semantics: drop 'host' and use 'path' for the camera ID, and use 'query' for options.
    - Add support for Windows' DShow camera selection
      - Our camera id -> index into the list of video-input devices; this gives us the same behavior as w/ Linux
      - Requires windows libs: strmiids, uuid, ole32, oleaut32
      - Compiles w/ MingW64, works w/ libav/ffmpeg
      - TODO: test compilation w/ MingW 32bit!
    - Don't push data to the texture if (linesize <= 0); this may happen due to a buggy decoder / setup ..
    Tested manually on GNU/Linux x64 and Windows x64:
    - GNU/Linux: libav 0.8, libav 9, libav 10, ffmpeg 1.2, ffmpeg 2.0
    - Windows: libav 0.8, libav 9, ffmpeg 2.0
    - Videos and camera
* Fix libav/ffmpeg compilation; FFMPEGMediaPlayer enhancements (More YUV*, use def. high camera options, cleanup symbols) (Sven Gothel, 2013-08-28; 3 files, -119/+127)
    - Fix libav/ffmpeg compilation
      - Split native GLContext code from JoglCommon
      - JoglCommon is required for ffmpeg_* c-compile/link
      - Supported versions now:
        - 0.8 53.53.51
        - 9.0 54.54.52
    - FFMPEGMediaPlayer
      - Update API doc, add compatibility .. etc
      - Pixel format conversions (via shader texture lookup func):
        - YUV420P, YUVJ420P
        - YUV422P, YUVJ422P
        - YUYV422
      - Properly handle aid/vid
      - In camera mode: set high default values
        - TODO: Make it configurable via camera URI: video_size, framerate, ..?
    - FFMPEGDynamicLibraryBundleInfo
      - Cleanup symbols / remove unused (pre 53)
      - Add av_dict_* methods
* Fix libav/ffmpeg compilation: Use 'dot less' dir/file names; Compile ffmpeg version dependent c-files individually and inject object files; ffmpeg *register_all() at setStream0(..) (Sven Gothel, 2013-08-28; 7 files, -79/+80)
    - Use 'dot less' dir/file names
    - Compile ffmpeg version dependent c-files individually and inject the object files.
    - ffmpeg *register_all() at setStream0(..)
      - Only register devices if available _and_ a camera is requested.
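  A hedged C sketch of the deferred registration point above: register formats/codecs at stream setup time and only pull in libavdevice when a camera is actually requested. The helper and its flags are illustrative, not the actual setStream0 code:

    #include <libavformat/avformat.h>
    #include <libavdevice/avdevice.h>

    static void registerAll(int cameraRequested, int avdeviceAvailable) {
        av_register_all();             /* formats + codecs; a no-op if already done */
        if (cameraRequested && avdeviceAvailable) {
            avdevice_register_all();   /* input devices, e.g. video4linux2 / dshow */
        }
    }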
* Fix libav/ffmpeg compilation across platforms: Move header back to 'stub_includes' (Sven Gothel, 2013-08-27; 119 files, -27966/+0)
* GLMediaPlayer: Add camera input / FFMPEG: Fix 'av_packet' leak and add missing symbol 'av_realloc'. (Sven Gothel, 2013-08-27; 2 files, -60/+123)
    - Add camera input
      - Use a URI w/ scheme 'camera' to determine that camera input is desired; use the URI host as the camera id, e.g. 'camera://0' for the 1st camera.
      - AndroidGLMediaPlayerAPI14: Via 'Camera'
      - FFMPEG*: Via libavdevice, device name and input format
      - TODO: Add controls to manipulate the camera, if available
    - FFMPEG*
      - Add symbols: avcodec_register_all, av_realloc (was missing), avdevice_register_all
      - Load libavdevice (opt)
      - Camera:
        - Use <ID> (windows) and /dev/video<ID> on other OS; simply find the input format in native code
      - Support YUYV422 (used in video4linux2, etc.)
        - Stuff 2x 16bpp (YUYV) into one RGBA pixel!
        - Add texture format for 16bpp
        - Add texture lookup shader
      - Fix av_packet leak in readNextImpl(..)
        - Restore the original pointer and size values, since we may have moved along within the packet; then call av_free_packet().
      - Use a null AudioSink if audio-id is NONE
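  A hedged C sketch of the av_packet leak fix described above: if the code advanced packet data/size while consuming the packet, the original values must be restored before av_free_packet(), otherwise the wrong pointer is freed or the buffer leaks. Variable names and the decode placeholder are illustrative:

    #include <libavcodec/avcodec.h>

    static void consumeAndFreePacket(AVPacket *pkt) {
        uint8_t *origData = pkt->data;   /* remember the original buffer */
        int      origSize = pkt->size;

        while (pkt->size > 0) {
            int used = pkt->size;        /* placeholder for e.g. avcodec_decode_audio4(..) */
            if (used <= 0) break;
            pkt->data += used;           /* walk along inside the packet */
            pkt->size -= used;
        }

        pkt->data = origData;            /* restore before freeing */
        pkt->size = origSize;
        av_free_packet(pkt);             /* API of that era; newer FFmpeg uses av_packet_unref() */
    }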
* libav/ffmpeg: Compile/Link 2 versions of the native FFMPEGMediaPlayer methods: FFMPEGNatives -> FFMPEGv08Natives + FFMPEGv09Natives (Sven Gothel, 2013-08-26; 124 files, -63/+28166)
    Enables FFMPEGMediaPlayer to work w/ either ffmpeg/libav version 8 or 9 w/ the same JOGL binary.
    The same C source code is compiled against:
      1: version 0.8  FFMPEGv08Natives  lavc53.lavf53.lavu51
      2: version 0.9  FFMPEGv09Natives  lavc54.lavf54.lavu52.lavr01
    FFMPEGv08Natives and FFMPEGv09Natives implement FFMPEGNatives; the native C code uses CPP '##' macro concatenation to produce unique function names.
    To enable 'cpp' to find the libav* header files matching the desired version, we have placed them in the c-file's folder, issued '#include "path/file.h"', and added symbolic links to allow finding the same module and 'sister modules':
      ls -l libavformat/
      ..
      lrwxrwxrwx 1 sven sven 13 Aug 26 12:56 libavcodec -> ../libavcodec
      lrwxrwxrwx 1 sven sven 14 Aug 26 12:56 libavformat -> ../libavformat
      lrwxrwxrwx 1 sven sven 12 Aug 26 12:57 libavutil -> ../libavutil
      ..
    At static init, FFMPEGDynamicLibraryBundleInfo determines the runtime version and instantiates the matching FFMPEGNatives, or null if none matches.
    FFMPEGMediaPlayer still compares the compile-time and runtime versions.
    FFMPEGMediaPlayer passes its own instance to FFMPEGNatives for callbacks.
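  A minimal C sketch of the CPP '##' concatenation trick referenced above, not the actual ffmpeg template code: compiling the same translation unit twice with a different suffix yields two sets of unique symbols from one body. The macro and function names are assumptions:

    /* Two-step concatenation so macro arguments get expanded first. */
    #define CONCAT_(a, b) a ## b
    #define CONCAT(a, b)  CONCAT_(a, b)

    #ifndef FF_SUFFIX
    #define FF_SUFFIX v08   /* the build compiles the same file again with -DFF_SUFFIX=v09 */
    #endif
    #define FF_FUNC(name) CONCAT(name, FF_SUFFIX)

    /* Expands to getAvVersion0v08 / getAvVersion0v09: one C body, two unique symbols. */
    static int FF_FUNC(getAvVersion0)(void) {
        return 53;  /* illustrative compile-time constant */
    }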
* ffmpeg/libav: Remove 'dead' audio/video frame count relation snoop-code (Sven Gothel, 2013-08-26; 2 files, -190/+4)
* libav/ffmpeg: version9: Add libavresample support; Proper AudioFormat negotiation w/ AudioSink; Misc (Sven Gothel, 2013-08-26; 2 files, -28/+207)
    - Add libavresample support
      - Resample if avail && (!AV_SAMPLE_FMT_S16 || !prefSampleRate || !sinkSupported)
      - Resample to: prefSampleRate (if set), AV_SAMPLE_FMT_S16 and min(channelCount, maxChannelCount)
    - Proper AudioFormat negotiation w/ AudioSink
      - Utilize AudioSink's 'isSupported(AudioFormat)'
    - Misc
      - Use 'av_get_bytes_per_sample(fmt)' always, don't assume 2
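  A small hedged C sketch of the last point above: derive the sample size from the actual format rather than assuming 2 bytes. The helper name is illustrative:

    #include <libavutil/samplefmt.h>

    /* Size of one decoded audio frame in bytes, derived from the actual sample format. */
    static int audioFrameBytes(enum AVSampleFormat fmt, int channels, int samplesPerChannel) {
        return av_get_bytes_per_sample(fmt) * channels * samplesPerChannel;
    }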
* libav/ffmpeg: Prepare for lavc54.lavf54.lavu52 (Sven Gothel, 2013-08-25; 2 files, -10/+46)
    - Add a compile-time/runtime version check; fail if the major versions do not match, assuming binary incompatibility
    - Add 'av_find_input_format' for future video input support
      - Manually map '/dev/video<NUM>' to video input - not working yet.
      - WINDOWS: Set file to '<NUM>'
      - Set input format string depending on OS
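  A minimal C sketch of such a major-version check, comparing the compile-time LIBAVCODEC_VERSION_MAJOR against the runtime avcodec_version(); the helper name is an assumption:

    #include <libavcodec/avcodec.h>

    /* Libav version words are packed as (major<<16 | minor<<8 | micro). */
    static int avcodecMajorVersionMatches(void) {
        unsigned runtimeMajor = avcodec_version() >> 16;
        return runtimeMajor == LIBAVCODEC_VERSION_MAJOR;  /* mismatch => assume binary incompatibility */
    }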
* AudioSink.init(..): abstract 'frame count' -> duration [ms], allowing non-frame-based AudioSinks to deal w/ desired queue sizes. (Sven Gothel, 2013-08-24; 2 files, -4/+4)
    - Rename AudioSink.initSink(..) -> AudioSink.init(..)
    - Move "int initialFrameCount, int frameGrowAmount, int frameLimit" to "int initialQueueSize, int queueGrowAmount, int queueLimit", based on milliseconds instead of frame count.
    - Pass the hint 'float frameDuration' to calculate the frame count for frame-based audio sinks, i.e. ALAudioSink.
    - Add sensible static final default values
    - AudioDataFormat: Add convenient conversion routines (samples/bytes/frame-count)
    - FFMPEGMediaPlayer: Retrieve the audio frame size in samples per channel and pass it to AudioSink.init(..) to properly calculate frame count/limits based on duration.
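  A hedged sketch of the milliseconds-based queue sizing idea above: convert a desired queue duration into a byte count from the negotiated audio format. The function and parameter names are illustrative, not the actual AudioDataFormat API:

    /* Desired queue duration in milliseconds -> byte count for a non-frame-based sink. */
    static long long queueDurationToBytes(int sampleRate, int channels,
                                          int bytesPerSample, int durationMs) {
        /* bytes per millisecond = rate * channels * bytesPerSample / 1000 */
        return ((long long)sampleRate * channels * bytesPerSample * durationMs) / 1000;
    }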
* GLMediaPlayer Multithreaded Decoding: GLMediaPlayer* (Part-6) - DONE (Sven Gothel, 2013-08-24; 2 files, -20/+28)
    Multithreaded decoding and the API should be considered stable by now; minor changes may apply if the Android/OMX impl. requires it. We still need to solve the TODOs as listed below, copied from 474ce65081ecd452215bc07ab866666cb11ca8b1.
    - *TextureFrame OO changes:
      - TextureFrame extends TimeFrameI
    - GLMediaPlayerImpl*
      - Adapt to Ringbuffer changes of GlueGen commit f9f881e59c78e3036cb3f956bc97cfc3197f620d
      - Fix impl. method's API doc
      - getNextTextureImpl(..) returns the video PTS
      - Fix audio-only playback
      - Frame dropping shall only happen if the previous frame has not been dropped, the frame is too late, and one decoded frame is already available
      - Don't block for the decoder anymore: nextFrame = "videoFramesDecoded.getBlocking() -> videoFramesDecoded.get()". 'No next decoded frame avail' could only mean slow decoding/hardware or slow transport, hence we shall not block rendering.
      - Add DEBUG output if using the last frame
      - Add integer property 'jogl.debug.GLMediaPlayer.StreamWorker.delay' in milliseconds to simulate slow decoding, i.e. a delay is added in StreamWorker after decoding, before pushing the new frame to the Ringbuffer.
    - FFMPEGMediaPlayer:
      - audioFrameLimitWithVideo 128 -> 64
      - audioFrameLimitAudioOnly 128 -> 32
      - Uses AudioSink's 'enqueueData(int pts, ByteBuffer bytes, int byteCount)'
      - Fixes for audio-only playback
    Working tests: MovieSimple and MovieCube
    TODO-1: Fix Android OMXGLMediaPlayer
    TODO-2:
    - Fix issue where async audio frames arrive much later than the 1st video frame, i.e. around 300ms.
    - Default TextureCount .. maybe 3?
    - Add audio synchronization?
    - Find the 'truth' about the correlation of audio and video PTS values; currently we assume both to be unrelated.
* GLMediaPlayer Multithreaded Decoding: GLMediaPlayer* (Part-5) - WIP (Sven Gothel, 2013-08-23; 2 files, -62/+437)
    - Update/fix GLMediaPlayer API doc
    - GLMediaEventListener: Add event bits for all state changes, to be delivered via attributesChanged(..)
    - StreamWorker / Decoder Thread:
      - Use StreamWorker only!
      - Handle exceptions on StreamWorker via StreamException
      - Handles stream initialization and decoding (-> initStream(..))
      - Split initGLStream(..) -> initStream(..) + initGL(GL)
        - Allow initStream(..)'s implementation to be executed on StreamWorker
        - Allow GL initialization to be 'postponed' when the stream is read, i.e. non-blocking stream initialization (UI .. etc)
      - Handle EOS via END_OF_STREAM_PTS -> pause/event
      - Video: Use the lock-free LFRingbuffer, similar to ALAudioSink (commit f18a94b3defef16e98badd6d99f2422609aa56c5)
    - FFMPEGDynamicLibraryBundleInfo
      - Add avcodec's: avcodec_get_frame_defaults, avcodec_free_frame (54.28.0), avcodec_flush_buffers
      - Add avutil's: av_frame_unref (55.0.0)
      - Add avformat's: avformat_seek_file (??)
    - FFMPEGMediaPlayer Native:
      - Add 'snoop' video frames for the a/v frame count relation. Disabled per default, since no longer needed due to ALAudioSink's grow-buffer usage of LFRingbuffer.
      - Use sp_avcodec_free_frame if available
      - 'useRefCountedFrames=1' for libav 55.0 to cache more than one audio frame; not used, since ALAudioSink's OpenAL usage does not require it (copies data once). Note: the above snooped video frame count is used here.
      - Use only one cached audio frame (-> see above, OpenAL copies data once), while reusing the NIO buffer!
      - Perform the OpenGL sync (glFinish) in native code!
      - Find the proper PTS value, i.e. either the frame's PTS or DTS, see 'PTSStats'.
    - FFMPEGMediaPlayer Java:
      - Use private fields
      - Simplified code due to the above changes.
    Working tests: MovieSimple and MovieCube
    TODO-1: Fix Android OMXGLMediaPlayer
    TODO-2:
    - Fix issue where async audio frames arrive much later than the 1st video frame, i.e. around 300ms.
    - Default TextureCount .. maybe 3?
    - Add audio synchronization?
    - Find the 'truth' about the correlation of audio and video PTS values; currently we assume both to be unrelated.
* GLMediaPlayer: Use URI instead of URL / Misc refinements (Sven Gothel, 2013-08-17; 1 file, -16/+4)
    - GLMediaPlayer: Use URI instead of URL, allowing passing of a non-resolved location
      - Java's URL doesn't allow 'other' protocols, i.e. RTSP
    - GLMediaPlayer: Add a table of test streams and their locations ..
    - FFMPEGMediaPlayer: Handle the av_read_play/pause response on the java side, ignore errors - simply dump them in DEBUG_NATIVE mode
* GLMediaPlayer Multithreaded Decoding: GLMediaPlayer* (Part-4) - WIP (Sven Gothel, 2013-08-16; 2 files, -31/+90)
    - Use Platform.currentTimeMillis() for accurate timing!
    - GLMediaPlayer / GLMediaPlayerImpl
      - Add DEBUG_NATIVE property jogl.debug.GLMediaPlayer.Native for verbose impl. messages, i.e. ffmpeg/libav
      - Add 'synchronization' section in GLMediaPlayer API doc (WIP)
      - Use passive non-blocking video synchronization, i.e. repeat frames instead of 'sleep'. Thx to Xerxes's suggestion.
      - Add flushing of cached decoded frames, allowing removal of the complicated 'videoSCR_reset_latch'
      - FramePusher (threaded decoding):
        - Always create a shared context!
        - Release the context while pausing
        - Pre/post 'getNextTextureImpl()' actions only at makeCurrent/release.
        - newFrameAvailable(..) signal after the decoded frame is enqueued
    - FFMPEGDynamicLibraryBundleInfo
      - Bind add. functions of libavcodec: "av_init_packet", "av_new_packet", "av_destruct_packet"
      - Bind add. functions of libavformat: "avformat_seek_file", "av_read_play", "av_read_pause"
      - DEBUG property := FFMPEGMediaPlayer.DEBUG || DynamicLibraryBundleInfo.DEBUG
    - FFMPEGMediaPlayer
      - Use libavformat's 'av_read_play()' and 'av_read_pause()', which may get utilized for network streams, e.g. RTSP
      - getNextTextureImpl(..):
        - Fix retry loop
        - Use postNextTextureImpl/preNextTextureImpl if desired (PSM)
      - Native:
        - Use the fixed my_av_q2i32(..) macro (again)
        - Use the INVALID_PTS marker (synced w/ Java code)
        - DEBUG: Dump more detailed frame information
        - TODO: Consider passing frame_delay, especially for repeated frames!
    - Tests (MovieSimple, MovieCube):
      - Refine KeyEvents control for seek and speed.
    - TODO:
      - Proper audio clock calculation - difficult w/ OpenAL!
      - Video / Audio sync: seek! streams w/ very async A/V frames
      - Test streams:
        - Five-minute-sync-test.mp4
        - Audio-Video-Sync-Test-Calibration-23.98fps-24fps.mp4
        - sound_in_sync_test.mp4
        - big_buck_bunny_1080p_surround.avi
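  A hedged C sketch of the av_read_play()/av_read_pause() usage mentioned above, mainly meaningful for network streams such as RTSP; per the commit, error responses are ignored or dumped in debug mode. The wrapper name is an assumption:

    #include <libavformat/avformat.h>

    static void setStreamPaused(AVFormatContext *fmtCtx, int paused) {
        int res = paused ? av_read_pause(fmtCtx) : av_read_play(fmtCtx);
        if (res < 0) {
            /* non-fatal for local files; only dump in debug mode */
        }
    }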
* GLMediaPlayer Multithreaded Decoding: GLMediaPlayer* (Part-3) - WIP (Sven Gothel, 2013-08-14; 2 files, -79/+163)
    - GLMediaPlayer
      - Remove State.Stopped and method stop() - redundant, use pause() / destroy()
      - Add notion of stream IDs
      - Add API doc: State / Stream-ID incl. html-anchor
      - Expose video/audio PTS, ..
      - Expose optional AudioSink
      - Min multithreaded textureCount is 4 (EGL* and FFMPEG*)
    - GLMediaPlayerImpl
      - Move AudioSink-related impl. to this class, allowing a tight video implementation to reuse the logic.
      - Remove 'synchronized' methods, synchronize on State where applicable
      - Implement new methods (see above)
      - playSpeed is handled partially in AudioSink. If it exceeds AudioSink's capabilities, drop audio and rely solely on video sync.
      - Video sync (WIP)
        - Video pts delay based on geometric weight
        - Reset video SCR if 'out of range', resync w/ PTS
    - FramePusher
      - Allow interruption when pausing/stopping, while waiting for the next available free frame to decode.
    - FFMPEGMediaPlayer
      - Add proper AudioDataFormat negotiation AudioSink <-> libav
      - Parse libav's SampleFormat
      - Remove AudioSink interaction (moved to GLMediaPlayerImpl)
    - Tests (MovieSimple, MovieCube):
      - Add aid/vid selection
      - Add KeyListener for actions: seek(..), play()/pause(), setPlaySpeed(..)
      - Dump perf-string every 2s
    - TODO:
      - Add audio sync in AudioSink, similar to GLMediaPlayer's weighted video delay; here: drop audio frames.
* Fix regression of commit 6332e13b2f0aa9818d37802302f04c90a4fa4239 causing optional OMX to fail to compile (Sven Gothel, 2013-08-14; 3 files, -5/+7)
* GLMediaPlayer: Add multithreaded decoding w/ textureCount > 2 where available: EGL/FFMPeg. WIP! (Sven Gothel, 2013-08-10; 3 files, -20/+55)
    Off-thread decoding: If the validated (impl) textureCount is > 2, decoding happens on an extra thread. If decoding requires a GL context, a shared context is created for the decoding thread.
    API changes:
    - initGLStream(..): Adds 'textureCount' as an argument.
    - TextureSequence.TexSeqEventListener.newFrameAvailable(..) exposes the new frame available
    - TextureSequence.TextureFrame exposes the PTS (video)
    Implementation:
    - 'int validateTextureCount(int)': the implementation decides whether textureCount can be > 2, i.e. off-thread decoding allowed; default is NO w/ textureCount==2!
    - 'boolean requiresOffthreadGLCtx()': the implementation decides whether a shared context is required for off-thread decoding
    - 'syncFrame2Audio(TextureFrame frame)': the implementation shall handle a/v sync, due to audio stream details (pts, buffered frames)
    - FFMPEGMediaPlayer extends GLMediaPlayerImpl, no more EGLMediaPlayerImpl (redundant)
    - SyncedRingbuffer: Expose T[] array
    TODO:
    - syncAV!
    - Test Android
* FFMPEGPlayer Audio Sink Refactoring .. (Sven Gothel, 2013-07-19; 2 files, -57/+34)
    - AudioSink w/ AudioFrame and formats public
    - ALAudioSink uses a circular buffer now, hence relaxes the one-threaded player mode
    - FFMPEGMediaPlayer uses multiple audio frames (equal to the ALAudioSink number) and wraps data to a NIO buffer w/o copy.
    - FFMPEGMediaPlayer audio threading currently disabled: distorted sound. Seems that the ALAudioSink's circular buffer usage is good enough for now.
    - Verbosity only w/ DEBUG flag
    - New SyncedRingbuffer for efficient synced buffering