Commit messages
information (if any), FFMPEG* utilizes NativeLibrary.get[Native]LibraryPath()
master-clock, enabling proper AV sync w/ untouched audio
We can finally utilize the added pass-through audio PTS, see commits
- GlueGen 52725b4c6525487f93407f529dc0a758b387a4fc
- JOAL 12029f1ec1d8afa576e1ac61655f318cc37c1d16
This enables us to use the audio PTS as the master-clock and adjust video to the untouched audio.
In case no audio is selected/playing or audio is muted,
we sync solely on the system-clock (SCR) w/o audio.
AV granularity is 22 ms; however, since the ALAudioSink PTS may be a little late,
this yields an even slightly better sync in case of too-early audio (d_apts < 0).
Since video frames are synced to audio, the resync procedure may result
in a hysteresis swinging into sync. This may be noticeable at start,
when audio is resumed, or after a seek.
We leave the audio frames untouched to reduce the processing burden
and allow undisrupted listening.
Passed AV sync tests
- Five-minute-sync-test.mp4
- Audio-Video-Sync-Test-Calibration-23.98fps-24fps.mp4
- Audio-Video-Sync-Test-2.mkv
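A minimal sketch of the master-clock selection described above, under the assumption of illustrative names (the actual GLMediaPlayerImpl members differ): the audio PTS drives the clock while audio plays, otherwise the system clock (SCR) takes over, and video counts as in sync within the 22 ms granularity window.

    // Hedged sketch only; names are illustrative stand-ins, not JOGL API.
    final class MasterClockSketch {
        static final int AV_GRANULARITY_MS = 22;   // sync window mentioned above

        /** Audio PTS is the master clock; fall back to the SCR when no audio plays. */
        int masterClockMillis(final Integer audioPtsMillis, final long scrStartMillis) {
            return audioPtsMillis != null ? audioPtsMillis
                                          : (int) (System.currentTimeMillis() - scrStartMillis);
        }

        /** d_apts < 0 means audio is early; video within the window counts as in sync. */
        boolean videoInSync(final int videoPtsMillis, final int masterClockMillis) {
            return Math.abs(videoPtsMillis - masterClockMillis) <= AV_GRANULARITY_MS;
        }
    }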
270172bcbd91f96d4a38a3d73e23d744f57a25b8) and joal (commit 03f4bb63ce8a358b1c2ef303480e1887d72ecb2e)
showing test-texture. Adding stop(); (API Change)
- allow multiple initGL(..) calls, both while uninitialized and initialized
- allows usage before the stream is ready
- use a test-texture while uninitialized
- add stop()
API change (see the usage sketch after this entry):
- initStream() -> playStream()
- play() -> resume()
FFMPEG: Added 'ready' check for robustness
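A hedged usage sketch of the renamed lifecycle; the interface below merely mirrors the method names mentioned in this entry and is not the real com.jogamp GLMediaPlayer signature.

    // Stand-in API for illustration only.
    final class LifecycleSketch {
        interface PlayerApi {
            void initGL();               // callable while uninitialized: renders the test-texture
            void playStream(String uri); // was initStream(..)
            void resume();               // was play()
            void stop();                 // newly added
        }

        static void demo(final PlayerApi p, final String uri) {
            p.initGL();        // usable before the stream is ready (test-texture)
            p.playStream(uri); // start stream initialization / playback
            p.initGL();        // allowed again once initialized (multiple initGL calls)
            p.resume();
            p.stop();
        }
    }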
InterruptedException(s)
Below is an updated list of Condition-Wait classifications
as described in Bug 1211.
This list includes recent changes in GlueGen
regarding Bug 1211.
A follow-up commit will address the unit tests.
A sketch contrasting the two wait styles follows this entry.
- Noncancelable + Persistent-Wait
- GLMediaPlayerImpl.StreamWorker thread (changed)
- pauses thread in case of intr
- Cancelable + Persistent-Wait:
- LFRingbuffer.getImpl(..)
- LFRingbuffer.waitForFreeSlots(int)
- SyncedRingbuffer.getImpl(..)
- SyncedRingbuffer.waitForFreeSlots(int)
- FunctionTask.invokeOnNewThread(..) (changed)
- RunnableTask.invokeOnNewThread(..) (changed)
- SharedResourceRunner.run()
- SharedResourceRunner.doAndWait() (changed)
- SharedResourceRunner.start() (changed)
- SharedResourceRunner.stop() (changed)
- GLMediaPlayerImpl.StreamWorker ctor (changed)
- GLMediaPlayerImpl caller thread actions do*() (changed)
- AndroidGLMediaPlayerAPI14.getNextTextureImpl(..) (changed)
- DisplayImpl.enqueueEvent(..) (changed)
-> Persistent-Wait
-> Cancels wait and NEWTEvent
-> dispatchMessage(NEWTEventTask): always notifyCaller!
- GLDrawableHelper.invoke(..) (changed)
- DefaultEDTUtil.waitUntilIdle() (changed)
- DefaultEDTUtil.waitUntilStopped() (changed)
- DefaultEDTUtil.invokeImpl(..) (changed)
- DefaultEDTUtil.NEDT.run(..) (changed)
- AWTEDTUtil.waitUntilStopped(..) (changed)
- AWTEDTUtil.invokeImpl(..) (changed)
- AWTEDTUtil.NEDT.run(..) (changed)
- SWTEDTUtil.invokeImpl(..) (changed)
- SWTEDTUtil.waitUntilStopped(..) (changed)
- SWTEDTUtil.NEDT.run(..) (changed)
- GLWorkerThread.invokeAndWait(..)
- GLWorkerThread.start() (changed)
- GLWorkerThread.WorkerRunnable.run() (changed)
- Animator.run() (changed)
- AnimatorBase.finishLifecycleAction() (changed)
- OSXUtil.RunOnMainThread(..) (changed)
- SingletonInstanceServerSocket.Server.shutdown() (changed)
- SingletonInstanceServerSocket.Server.start() (changed)
- com.jogamp.audio.windows.waveout.Mixer.shutdown() (changed)
- Extending/Using InterruptSource.Thread (changed)
- DefaultEDTUtil.NEDT
- AWTEDTUtil.NEDT
- SWTEDTUtil.NEDT
- GLWorkerThread.thread
- Mixer.FillerThread
- Mixer.MixerThread
- Using InterruptSource.Thread (changed)
- TempFileCache
- LauncherTempFileCache
- Animator.thread
- SingletonInstanceServerSocket.Server.serverThread
Deprecated:
- FunctionTask.invoke(..) (changed)
-> on current thread, no wait -> deprecated
- RunnableTask.invoke(..) (changed)
-> on current thread, no wait -> deprecated
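A minimal sketch contrasting the two wait styles classified above, assuming plain Object monitors; this is a simplification, not the actual JogAmp utility code.

    final class ConditionWaitSketch {
        private final Object lock = new Object();
        private boolean condition = false;

        /** Noncancelable + Persistent-Wait: an interrupt does not end the wait. */
        void awaitPersistently() {
            boolean interrupted = false;
            synchronized (lock) {
                while (!condition) {
                    try {
                        lock.wait();
                    } catch (final InterruptedException e) {
                        interrupted = true;          // remember, but keep waiting
                    }
                }
            }
            if (interrupted) {
                Thread.currentThread().interrupt();  // restore the interrupt state for the caller
            }
        }

        /** Cancelable: the interrupt cancels the wait and propagates to the caller. */
        void awaitCancelable() throws InterruptedException {
            synchronized (lock) {
                while (!condition) {
                    lock.wait();                     // InterruptedException aborts the wait
                }
            }
        }
    }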
sed -i 's/javax\.media\.opengl/com\.jogamp\.opengl/g' `grep -Rl "javax\.media\.opengl" src`
sed -i 's/javax\.media\.nativewindow/com\.jogamp\.nativewindow/g' `grep -Rl "javax\.media\.nativewindow" src`
sed -i 's/javax\/media\//com\/jogamp\//g' `grep -Rl "javax/media/" src`
sed -i 's/javax\/media\//com\/jogamp\//g' `grep -Rl "javax/media/" doc`
Manually edited all occurrences within make/**
- ShaderCode:
- Using Uri
- Also encode the rel. 'includeFile' (was missing earlier)
- GLMediaPlayer
- Exposes Uri in API, removed URI
unboxing
or due branching)
- AWT TextRenderer: Add throw new InternalError("fontRenderContext never initialized!"); FIXME!
- GLContextImpl.hasFBOImpl(): Fix serious NPE issue if extCache is null
- GLDrawableFactoryImpl.createOffscreenDrawableImpl(..):
- Fix NPE issue w/ null drawable
- Fix resetting GammaRamp by ensuring originalGammaRamp will be set at 1st setGammaRamp(..)
- AndroidGLMediaPlayerAPI14: Fix NPE: Use the already resolved local reference
- EGLDrawableFactory: Fix NPE: Only operate on non null surface!
- ALAudioSink.dequeueBuffer(..): Only resolve releasedBuffer elements if not null
-
c47bc86ae2ee268a1f38c5580d11f93d7f8d6e74)
- Change non static accesses to static members using declaring type
- Change indirect accesses to static members to direct accesses (accesses through subtypes)
- Add final modifier to private fields
- Add final modifier to method parameters
- Add final modifier to local variables
- Remove unnecessary casts
- Remove unnecessary '$NON-NLS$' tags
- Remove trailing white spaces on all lines
API stability
multiple media textures (Android) or shared GL context are not usable.
- GLMediaPlayer:
- TEXTURE_COUNT_MIN is the new minimum: '1' - i.e. no multithreading, single-threaded player
- TEXTURE_COUNT_DEFAULT is '4' - multithreaded
- GLMediaPlayerImpl:
- Add single-threaded mode, but perform initStreamImpl(..) off-thread.
-
orientation change (flipped), API-doc,
- State
- Fix state transition (initGL() error)
- Camera options
- options use ';' as the query separator (see the URI example after this entry)
- don't use 'default' options, the driver should know them
- Detect and act on orientation change (flipped)
- the ffmpeg impl detects if 'flipped' changes and triggers a SIZE update event.
This allows the application to react, i.e. re-init GL and use the new TextureCoords.
Test: Works well on Windows w/ rawvideo dshow camera driver/codec.
- API-doc
- TexSeqEventListener/GLMediaEventListener usage / constraints (GL, ..)
- State transition fix
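A hedged example of building such a camera URI in Java: the camera ID goes into the path (see the camera-URI entry further down) and the ';'-separated options into the query. The option keys shown (width, height, rate) are illustrative only; this entry does not list the actual option names.

    import java.net.URI;
    import java.net.URISyntaxException;

    final class CameraUriSketch {
        static URI cameraUri(final int cameraId) throws URISyntaxException {
            // e.g. camera:/0?width=640;height=480;rate=15
            return new URI("camera", null, "/" + cameraId, "width=640;height=480;rate=15", null);
        }

        public static void main(final String[] args) throws URISyntaxException {
            System.out.println(cameraUri(0));
        }
    }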
windows, 2 more pixel formats, fail-safe data handling
- add support for ffmpeg 2 / libav 10 -> lavc55_lavf55_lavu52_lavr01
- add support for ffmpeg libswresample (similar to libavresample)
- handle BGRA (GL type) and BGR24 (texture shader)
- Change Camera URI semantics, drop 'host' and use 'path' for camera ID
and use 'query' for options.
- add support for Window's DShow camera selection
- our camera ID -> index into the list of video-input devices,
which gives us the same behavior as on Linux
- requires windows libs: strmiids, uuid, ole32, oleaut32
- Compiles w/ MingW64, works w/ libav/ffmpeg
- TODO: test compilation w/ MingW 32bit !
- don't push data to the texture if (linesize <= 0);
this may happen due to a buggy decoder / setup ..
Tested manually on GNU/Linux x64 and Windows x64:
- GNU/Linux libav 0.8, libav 9, libav 10, ffmpeg 1.2, ffmpeg 2.0
- Windows libav 0.8, libav 9, ffmpeg 2.0
- videos and camera
missing symbol 'av_realloc'.
- Add camera input
- Use a URI w/ scheme 'camera' to signal that camera input is desired,
using the URI host as the camera ID.
E.g. 'camera://0' for the 1st camera.
- AndroidGLMediaPlayerAPI14: Via 'Camera'
- FFMPEG*: Via libavdevice, device name and input format
- TODO: Add controls to manipulate camera if available
- FFMPEG*
- Add symbols
- avcodec_register_all
- av_realloc (was missing)
- avdevice_register_all
- Load libavdevice (opt)
- Camera:
- Use <ID> (Windows) and /dev/video<ID> on other OSes
- simply find the input format in native code
- Support YUYV422 (used in video4linux2, etc.)
- Stuff 2x 16bpp (YUYV) into one RGBA pixel!
- Add texture format for 16bpp
- Add texture lookup shader
- Fix av_packet leak in readNextImpl(..)
- Restore the original pointer and size values,
since we may have advanced within the packet,
then call av_free_packet().
- Use null AudioSink if audio-id is NONE
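A hedged, CPU-side illustration of the YUYV422 packing mentioned above: one RGBA texel carries two source pixels as (R,G,B,A) = (Y0,U,Y1,V). In the player the reconstruction happens in the texture lookup shader; this Java sketch only shows the same arithmetic (a BT.601-style integer approximation) for clarity.

    // One packed RGBA texel = (Y0, U, Y1, V) = two source pixels sharing chroma.
    final class Yuyv422Sketch {
        /** Expands one packed texel into two RGB pixels. */
        static int[][] texelToRgb(final int y0, final int u, final int y1, final int v) {
            return new int[][] { yuvToRgb(y0, u, v), yuvToRgb(y1, u, v) };
        }

        /** BT.601-style YCbCr -> RGB integer approximation. */
        static int[] yuvToRgb(final int y, final int u, final int v) {
            final int c = y - 16, d = u - 128, e = v - 128;
            return new int[] {
                clamp((298 * c + 409 * e + 128) >> 8),
                clamp((298 * c - 100 * d - 208 * e + 128) >> 8),
                clamp((298 * c + 516 * d + 128) >> 8)
            };
        }

        static int clamp(final int x) { return Math.max(0, Math.min(255, x)); }
    }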
Add EOS detection, setAudioVolume(..)
GLMediaPlayerImpl.initStreamGL(..):
Only require a minimum texture count of 2,
which is the bare minimum to allow our algorithm to work,
i.e. having a 'lastFrame' plus one frame each in the available/playing ringbuffers.
Android's MediaPlayer API can only deal w/ one SurfaceTexture,
hence we have to fake a second SurfaceTextureFrame w/ same content
to allow our implementation to work w/ the threaded decoder (min 2 frames).
allowing to change the audio volume.
Multithreaded decoding and the API should be considered stable by now;
minor changes may apply if the Android/OMX impl. requires them.
We still need to resolve the TODOs listed below, copied from 474ce65081ecd452215bc07ab866666cb11ca8b1.
+++
- *TextureFrame OO changes:
- TextureFrame extends TimeFrameI
- GLMediaPlayerImpl*
- Adapt to Ringbuffer changes of GlueGen commit f9f881e59c78e3036cb3f956bc97cfc3197f620d
- Fix impl. method's API doc
- getNextTextureImpl(..) returns video PTS
- Fix audio-only playback
- frame dropping shall only happen if (see the sketch after this entry):
- the previous frame has not been dropped
- the frame is too late
- one decoded frame is already available
- Don't block on the decoder anymore:
- nextFrame = "videoFramesDecoded.getBlocking() -> videoFramesDecoded.get()";
'no next decoded frame available' can only mean:
- slow decoding/hardware
- slow transport
hence we shall not block rendering.
- Add DEBUG output if using last frame
- Add integer property 'jogl.debug.GLMediaPlayer.StreamWorker.delay' in milliseconds
to simulate slow decoding, i.e. delay is added in StreamWorker after decoding
before pushing new frame to Ringbuffer.
- FFMPEGMediaPlayer:
- audioFrameLimitWithVideo 128 -> 64
- audioFrameLimitAudioOnly 128 -> 32
- uses AudioSink's 'enqueueData(int pts, ByteBuffer bytes, int byteCount)'
- fixes for audio-only playback
+++
Working Tests: MovieSimple and MovieCube
TODO-1: Fix
- Android
- OMXGLMediaPlayer
TODO-2:
- Fix issue where async audio frames arrive much later than 1st video frame, i.e. around 300ms.
- Default TextureCount .. maybe 3 ?
- Adding Audio synchronization ?
- Find 'truth' about correlation of audio and video PTS values,
currently, we assume both to be unrelated ?
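A hedged sketch of the frame-drop policy described above; the class and names are simplified stand-ins, not the actual GLMediaPlayerImpl fields.

    final class FrameDropPolicy {
        private boolean lastFrameDropped = false;

        /**
         * Dropping is only allowed if the previous frame was shown,
         * this frame is already too late, and a newer decoded frame is waiting.
         */
        boolean shouldDrop(final int framePts, final int clockPts,
                           final int maxLatenessMillis, final boolean nextFrameAvailable) {
            final boolean tooLate = (clockPts - framePts) > maxLatenessMillis;
            final boolean drop = !lastFrameDropped && tooLate && nextFrameAvailable;
            lastFrameDropped = drop;
            return drop;
        }
    }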
- Update/fix GLMediaPlayer API doc
- GLMediaEventListener: Add event bits for all state changes to be delivered via attributesChanged(..)
- StreamWorker / Decoder Thread:
- Use StreamWorker only !
- Handle exceptions on StreamWorker via StreamException
- Handles stream initialization and decoding (-> initStream(..))
- Split initGLStream(..) -> initStream(..) + initGL(GL)
- allow initStream(..)'s implementation to be executed on the StreamWorker
- allow GL initialization to be 'postponed' until the stream is ready,
i.e. non-blocking stream initialization (UI, etc.); see the sketch after this entry
- Handle EOS via END_OF_STREAM_PTS -> pause/event
- Video: Use lock-free LFRingbuffer, similar to
ALAudioSink (commit f18a94b3defef16e98badd6d99f2422609aa56c5)
+++
- FFMPEGDynamicLibraryBundleInfo
- Add avcodec's:
- avcodec_get_frame_defaults, avcodec_free_frame (54.28.0), avcodec_flush_buffers,
- Add avutil's:
- av_frame_unref (55.0.0)
- Add avformat's:
- avformat_seek_file (??)
+++
- FFMPEGMediaPlayer Native:
- add 'snoop' video frames for the a/v frame-count relation.
disabled by default, since it is no longer needed due to ALAudioSink's
grow-buffer usage of LFRingbuffer.
- use sp_avcodec_free_frame if available
- 'useRefCountedFrames=1' for libav 55.0 to cache more than one audio frame,
not used since ALAudioSink's OpenAL usage does not require it (copies data once).
Note: the above snooped-video frame count is used here.
- use only one cached audio-frame (-> see above, OpenAL copies data once),
while reusing the NIO buffer!
- Perform OpenGL sync (glFinish) in native code!
- find proper PTS value, i.e. either frame's PTS or DTS,
see 'PTSStats'.
- FFMPEGMediaPlayer Java:
- use private fields
- simplified code due to above changes.
+++
Working Tests: MovieSimple and MovieCube
TODO-1: Fix
- Android
- OMXGLMediaPlayer
TODO-2:
- Fix issue where async audio frames arrive much later than 1st video frame, i.e. around 300ms.
- Default TextureCount .. maybe 3 ?
- Adding Audio synchronization ?
- Find 'truth' about correlation of audio and video PTS values,
currently, we assume both to be unrelated ?
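A hedged, self-contained sketch of the split described above: stream initialization runs on a worker thread while GL initialization is deferred until the stream attributes are known. All types are illustrative stand-ins, not the real GLMediaPlayer/GLMediaEventListener API.

    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.Executor;

    final class NonBlockingInitSketch {
        interface Player {
            void initStreamBlocking(String uri);  // slow: probes container, codecs, size
            void initGL();                        // quick: allocates textures on the GL thread
        }

        static CompletableFuture<Void> playStream(final Player player, final String uri,
                                                  final Executor glThread) {
            // 1) Probe the stream off-thread ("StreamWorker"), keeping the caller responsive.
            return CompletableFuture
                    .runAsync(() -> player.initStreamBlocking(uri))
                    // 2) Once the stream attributes are known, finish GL setup on the GL thread.
                    .thenRunAsync(player::initGL, glThread);
        }
    }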
- GLMediaPlayer: Use URI instead of URL, allowing a non-resolved location to be passed
- Java's URL doesn't allow 'other' protocols, e.g. RTSP
- GLMediaPlayer: Add a table of test streams and their locations ..
- FFMPEGMediaPlayer
- Handle the av_read_play/pause response on the Java side; ignore errors and simply dump them in DEBUG_NATIVE mode
- Use Platform.currentTimeMillis() for accurate timing!
- GLMediaPlayer / GLMediaPlayerImpl
- Add DEBUG_NATIVE property jogl.debug.GLMediaPlayer.Native
for verbose impl. messages, i.e. ffmpeg/libav
- Add 'synchronization' section in GLMediaPlayer API doc (WIP)
- Use passive non-blocking video synchronization,
i.e. repeat frames instead of 'sleep' (see the sketch after this entry).
Thanks to Xerxes's suggestion.
- Add flushing of cached decoded frames,
allowing removal of the complicated 'videoSCR_reset_latch'
- FramePusher (threaded decoding):
- Always create a shared context!
- Release context while pausing
- Pre/post 'getNextTextureImpl()' actions only
at makeCurrent/release.
- newFrameAvailable(..) signal after decoded frame is enqueued
- FFMPEGDynamicLibraryBundleInfo
- Bind add. functions of libavcodec:
+ "av_init_packet",
+ "av_new_packet",
+ "av_destruct_packet",
- Bind add. functions of libavformat:
+ "avformat_seek_file",
+ "av_read_play",
+ "av_read_pause",
- DEBUG property := FFMPEGMediaPlayer.DEBUG || DynamicLibraryBundleInfo.DEBUG;
- FFMPEGMediaPlayer
- Use libavformat's 'av_read_play()' and 'av_read_pause()',
which may get utilized for network streams, e.g. RTSP
- getNextTextureImpl(..):
- Fix retry loop
- Use postNextTextureImpl/preNextTextureImpl if desired (PSM)
- Native:
- Use fixed my_av_q2i32(..) macro (again)
- Use INVALID_PTS marker (synced w/ Java code)
- DEBUG: Dump more detailed frame information
- TODO: Consider passing frame_delay, especially for repeated frames!
- Tests (MovieSimple, MovieCube):
- Refine KeyEvents control for seek and speed.
- TODO:
- Proper audio clock calculation - difficult w/ OpenAL !
- Video / Audio sync:
- seek !
- streams w/ very async A/V frames
- Test Streams:
- Five-minute-sync-test.mp4
- Audio-Video-Sync-Test-Calibration-23.98fps-24fps.mp4
- sound_in_sync_test.mp4
- big_buck_bunny_1080p_surround.avi
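A hedged sketch of the passive non-blocking synchronization noted above ('repeat frames instead of sleep'): the renderer never sleeps; if the next frame's PTS is still in the future it keeps presenting the last frame and checks again on the next display cycle. Names are illustrative, not the actual GLMediaPlayerImpl members.

    final class PassiveSyncSketch {
        static final class VideoFrame { int pts; }

        private VideoFrame lastFrame;

        VideoFrame frameForDisplay(final java.util.Queue<VideoFrame> decoded, final int clockPts) {
            final VideoFrame next = decoded.peek();
            if (next != null && next.pts <= clockPts) {
                lastFrame = decoded.poll();   // due (or overdue): present it now
            }
            return lastFrame;                 // otherwise repeat the previous frame
        }
    }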
- GLMediaPlayer
- Remove State.Stopped and method stop() - redundant, use pause() / destroy()
- Add notion of stream IDs
- Add API doc: State / Stream-ID incl. html-anchor
- Expose video/audio PTS, ..
- Expose optional AudioSink
- Min multithreaded textureCount is 4 (EGL* and FFMPEG*)
- GLMediaPlayerImpl
- Move the AudioSink-related impl. to this class,
allowing a tight video implementation that reuses the logic.
- Remove 'synchronized' methods, synchronize on State
where applicable
- implement new methods (see above)
- playSpeed is handled partially in AudioSink.
If it exceeds AudioSink's capabilities, drop audio and rely solely on video sync.
- video sync (WIP)
- video pts delay based on geometric weight (see the sketch after this entry)
- reset video SCR if 'out of range', resync w/ PTS
-
- FramePusher
- allow interruption when pausing/stopping,
while waiting for the next available free frame to decode.
- FFMPEGMediaPlayer
- Add proper AudioDataFormat negotiation AudioSink <-> libav
- Parse libav's SampleFormat
- Remove AudioSink interaction (moved to GLMediaPlayerImpl)
- Tests (MovieSimple, MovieCube):
- Add aid/vid selection
- Add KeyListener for actions: seek(..), play()/pause(), setPlaySpeed(..)
- Dump perf-string each 2s
- TODO:
- Add audio sync in AudioSink, similar to GLMediaPlayer's weighted video delay,
here: drop audio frames.
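One hedged reading of the 'geometric weight' PTS delay above is an exponentially decaying weighted average of recent video PTS deltas, which smooths jitter before it is used to delay or resync frames. This is an interpretation, not the actual GLMediaPlayerImpl code.

    final class PtsDelayFilter {
        private static final float WEIGHT = 0.3f;   // newer samples contribute 30%
        private float smoothedDelayMillis = 0f;

        /** Feed the latest (clock - framePts) difference; returns the smoothed delay. */
        float update(final int rawDelayMillis) {
            smoothedDelayMillis = WEIGHT * rawDelayMillis + (1f - WEIGHT) * smoothedDelayMillis;
            return smoothedDelayMillis;
        }
    }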
available EGL/FFMPeg. WIP!
Off-thread decoding:
If the validated (impl) textureCount > 2, decoding happens on an extra thread.
If decoding requires GL context, a shared context is created for decoding thread.
API Changes:
- initGLStream(..): Adds 'textureCount' as argument.
- TextureSequence.TexSeqEventListener.newFrameAvailable(..) exposes the new frame available
- TextureSequence.TextureFrame exposes the PTS (video)
Implementation:
- 'int validateTextureCount(int)': implementation decides whether textureCount can be > 2, i.e. off-thread decoding allowed,
default is NO w/ textureCount==2!
- 'boolean requiresOffthreadGLCtx()': implementation decides whether shared context is required for off-thread decoding
- 'syncFrame2Audio(TextureFrame frame)': implementation shall handle a/v sync, due to audio stream details (pts, buffered frames)
- FFMPEGMediaPlayer extends GLMediaPlayerImpl, no more EGLMediaPlayerImpl (redundant)
+++
- SyncedRingbuffer: Expose T[] array
+++
TODO:
- syncAV!
- test Android
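A hedged sketch of an implementation opting into off-thread decoding via the hooks named above; the enclosing class is an illustrative stand-in, not the real FFMPEGMediaPlayer code.

    abstract class OffThreadCapablePlayerSketch /* extends GLMediaPlayerImpl */ {
        /** Allow more than 2 textures, which enables the decoder thread. */
        protected int validateTextureCount(final int desiredTextureCount) {
            return Math.max(desiredTextureCount, 3);   // 2 would force single-threaded mode
        }

        /** This decoder uploads via GL, so the worker needs a shared context. */
        protected boolean requiresOffthreadGLCtx() {
            return true;
        }
    }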
- AudioSink w/ AudioFrame and formats public
- ALAudioSink uses a circular buffer now, hence relaxing the single-threaded player mode
- FFMPEGMediaPlayer uses multiple audio frames (equal to the ALAudioSink number)
and wraps data to NIO buffer w/o copy.
- FFMPEGMediaPlayer audio threading currently disabled: distorted sound
Seems that the ALAudioSink's circular buffer usage is good enough for now.
- Verbosity only w/ DEBUG flag
- New SyncedRingbuffer for efficient synced buffering
implementations.
General enhancements.
For details about TextureSequence/GLMediaPlayer shader collaboration w/ your own shader source,
see TextureSequence and TexCubeES2 / MovieSimple demo.
TextureSequence allows implementations to provide their own texture lookup function
which may provide color space conversion (YUV) .. or other runtime hw-accel features.
Have a look at the next commit, which provides a Libav/FFmpeg implementation w/ YUV/RGB shader conversion.
MovieCube adds keyboard control (Android: firm touch on display to launch keyboard, don't break it though :)
Texture
- Add TextureSequence, the base interface of GLMediaPlayer, to generalize texture streams
- TextureSequence / GLMediaPlayer: Use inner classes for event and texture data
- getLastTexture() shall never return 'null'; initialization of TextureSequence (initGLStream(..), etc.)
shall provide a TextureFrame w/ the stream's dimensions.
- GLMediaPlayerImpl.createTexImageImpl() y-flip defaults to 'false';
the impl. shall define y-flip, if required.
- Added MovieCube demo
- Fix Texture: initialize aspectRatio for 'wrapping' ctor
-
GLMediaPlayer:
Merging 'initStream()' and 'initGL()' into 'initGLStream()' due to incompatible/buggy implementations (Android/Tegra)
requiring the GL texture to be set up before preparing the stream.
This also implies that w/o a GL context we cannot fetch the stream information (size, ..),
hence we need to evaluate this detail (FIXME).
'getNextTexture(GL gl, boolean blocking)' can request the impl. to block.
GLMediaEventListener:
The TextureFrame is not yet available, adding 'when'.
- Factory falls back to NullGLMediaPlayer, allowing testing on platforms where no player is available.
- MovieSimple (c) to JogAmp since it is no longer derived from the old project.
connected URL. API doc: Using an HTML table for the state chart.
initStream(URL) + initGL(GL)) .. IllegalStateException if wrong. Using internet streams of BigBuckBunny, if avail.
- Splitting the initialization into stream and GL parts allows using the stream information (e.g. size, ..)
for setting the GLDrawable properties ..
- Make the impl. more bullet-proof ..
Android API 14
- Introduce states
- Customize / access texture target, count, and features.
- Expose TextureFrame.
- Use 'long' for all time values in msec.
- Mark information as optional in the API doc (fps, bps, ..)
Android API 14 MediaPlayer impl of GLMediaPlayer.
Android API 14 MediaPlayer allows usage of OMX AL direct decode to texture via libstagefright (OMX AL usage included).
Status: Untested, not working - Need to fix native OMX IL (stream detect and split) and/or GStreamer implementation.