public final class MediaSync extends Object
MediaSync is generally used like this:
```java
MediaSync sync = new MediaSync();
sync.setSurface(surface);
Surface inputSurface = sync.createInputSurface();
...
// MediaCodec videoDecoder = ...;
videoDecoder.configure(format, inputSurface, ...);
...
sync.setAudioTrack(audioTrack);
sync.setCallback(new MediaSync.Callback() {
    @Override
    public void onAudioBufferConsumed(MediaSync sync, ByteBuffer audioBuffer, int bufferId) {
        ...
    }
}, null);
// This needs to be done since sync is paused on creation.
sync.setPlaybackParams(new PlaybackParams().setSpeed(1.f));

for (;;) {
    ...
    // send video frames to surface for rendering, e.g., call
    // videoDecoder.releaseOutputBuffer(videoOutputBufferIx, videoPresentationTimeNs);
    // More details are available as below.
    ...
    sync.queueAudio(audioByteBuffer, bufferId, audioPresentationTimeUs); // non-blocking.
    // The audioByteBuffer and bufferId will be returned via callback.
    // More details are available as below.
    ...
}
sync.setPlaybackParams(new PlaybackParams().setSpeed(0.f));
sync.release();
sync = null;

// The following code snippet illustrates how video/audio raw frames are created by
// MediaCodec objects, how they are fed to MediaSync and how they are returned by MediaSync.

// This is the callback from MediaCodec.
onOutputBufferAvailable(MediaCodec codec, int bufferId, BufferInfo info) {
    // ...
    if (codec == videoDecoder) {
        // surface timestamp must contain media presentation time in nanoseconds.
        codec.releaseOutputBuffer(bufferId, 1000 * info.presentationTimeUs);
    } else {
        ByteBuffer audioByteBuffer = codec.getOutputBuffer(bufferId);
        sync.queueAudio(audioByteBuffer, bufferId, info.presentationTimeUs);
    }
    // ...
}

// This is the callback from MediaSync.
onAudioBufferConsumed(MediaSync sync, ByteBuffer buffer, int bufferId) {
    // ...
    audioDecoder.releaseOutputBuffer(bufferId, false);
    // ...
}
```

The client needs to configure the corresponding sink by setting the Surface and/or AudioTrack based on the stream type it will play.
For video, the client needs to call createInputSurface() to obtain a surface on which it will render video frames.

For audio, the client needs to set up the audio track correctly, e.g., using AudioTrack.MODE_STREAM. The audio buffers are sent to MediaSync directly via queueAudio(java.nio.ByteBuffer, int, long), and are returned to the client via MediaSync.Callback.onAudioBufferConsumed(android.media.MediaSync, java.nio.ByteBuffer, int) asynchronously. The client should not modify an audio buffer till it's returned.
The client can optionally pre-fill audio/video buffers by setting playback rate to 0.0, and then feed audio/video buffers to corresponding components. This can reduce possible initial underrun.
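For example, a minimal sketch of that pre-fill pattern, assuming a sync object and decoders configured as in the snippet above:

```java
// Sketch only: "sync" and the decoders are assumed to be configured as in the
// snippet above. MediaSync starts paused, so an explicit speed of 0 just makes
// the pre-fill intent obvious.
sync.setPlaybackParams(new PlaybackParams().setSpeed(0.f));

// Feed some audio buffers via queueAudio() and render a few video frames via
// releaseOutputBuffer() here; nothing is played back yet at speed 0.

// Once enough data has been buffered, start playback at normal speed.
sync.setPlaybackParams(new PlaybackParams().setSpeed(1.f));
```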
| Modifier and Type | Class and Description |
|---|---|
| static class | MediaSync.Callback: MediaSync callback interface. |
| static interface | MediaSync.OnErrorListener: Interface definition of a callback to be invoked when there has been an error during an asynchronous operation (other errors will throw exceptions at method call time). |
| Modifier and Type | Field and Description |
|---|---|
| static int | MEDIASYNC_ERROR_AUDIOTRACK_FAIL: Audio track failed. |
| static int | MEDIASYNC_ERROR_SURFACE_FAIL: The surface failed to handle video buffers. |
| Constructor and Description |
|---|
| MediaSync(): Class constructor. |
| Modifier and Type | Method and Description |
|---|---|
| Surface | createInputSurface(): Requests a Surface to use as the input. |
| protected void | finalize(): Called by the garbage collector on an object when garbage collection determines that there are no more references to the object. |
| void | flush(): Flushes all buffers from the sync object. |
| PlaybackParams | getPlaybackParams(): Gets the playback rate using PlaybackParams. |
| SyncParams | getSyncParams(): Gets the A/V sync mode. |
| MediaTimestamp | getTimestamp(): Get current playback position. |
| void | queueAudio(ByteBuffer audioData, int bufferId, long presentationTimeUs): Queues the audio data asynchronously for playback (AudioTrack must be in streaming mode). |
| void | release(): Make sure you call this when you're done to free up any opened component instance instead of relying on the garbage collector to do this for you at some point in the future. |
| void | setAudioTrack(AudioTrack audioTrack): Sets the audio track for MediaSync. |
| void | setCallback(MediaSync.Callback cb, Handler handler): Sets an asynchronous callback for actionable MediaSync events. |
| void | setOnErrorListener(MediaSync.OnErrorListener listener, Handler handler): Sets an asynchronous callback for error events. |
| void | setPlaybackParams(PlaybackParams params): Sets playback rate using PlaybackParams. |
| void | setSurface(Surface surface): Sets the output surface for MediaSync. |
| void | setSyncParams(SyncParams params): Sets A/V sync mode. |
public static final int MEDIASYNC_ERROR_AUDIOTRACK_FAIL
See Also: MediaSync.OnErrorListener, Constant Field Values

public static final int MEDIASYNC_ERROR_SURFACE_FAIL
See Also: MediaSync.OnErrorListener, Constant Field Values

public MediaSync()
protected void finalize()
Overrides: finalize in class Object

Called by the garbage collector on an object when garbage collection determines that there are no more references to the object. A subclass overrides the finalize method to dispose of system resources or to perform other cleanup.
The general contract of finalize is that it is invoked if and when the Java™ virtual machine has determined that there is no longer any means by which this object can be accessed by any thread that has not yet died, except as a result of an action taken by the finalization of some other object or class which is ready to be finalized. The finalize method may take any action, including making this object available again to other threads; the usual purpose of finalize, however, is to perform cleanup actions before the object is irrevocably discarded. For example, the finalize method for an object that represents an input/output connection might perform explicit I/O transactions to break the connection before the object is permanently discarded.

The finalize method of class Object performs no special action; it simply returns normally. Subclasses of Object may override this definition.

The Java programming language does not guarantee which thread will invoke the finalize method for any given object. It is guaranteed, however, that the thread that invokes finalize will not be holding any user-visible synchronization locks when finalize is invoked. If an uncaught exception is thrown by the finalize method, the exception is ignored and finalization of that object terminates.

After the finalize method has been invoked for an object, no further action is taken until the Java virtual machine has again determined that there is no longer any means by which this object can be accessed by any thread that has not yet died, including possible actions by other objects or classes which are ready to be finalized, at which point the object may be discarded.

The finalize method is never invoked more than once by a Java virtual machine for any given object.

Any exception thrown by the finalize method causes the finalization of this object to be halted, but is otherwise ignored.
public final void release()
public void setCallback(MediaSync.Callback cb, Handler handler)
This method can be called multiple times to update a previously set callback. If the handler is changed, undelivered notifications scheduled for the old handler may be dropped.
Do not call this inside callback.
Parameters:
cb - The callback that will run. Use null to stop receiving callbacks.
handler - The Handler that will run the callback. Use null to use MediaSync's internal handler if it exists.
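For illustration (not part of the original reference), a sketch of routing callbacks to a dedicated worker thread via the handler argument:

```java
// Sketch: deliver MediaSync callbacks on a dedicated worker thread. "sync" and the
// audio decoder are assumed to exist already.
HandlerThread callbackThread = new HandlerThread("MediaSyncCallbacks");
callbackThread.start();
Handler callbackHandler = new Handler(callbackThread.getLooper());

sync.setCallback(new MediaSync.Callback() {
    @Override
    public void onAudioBufferConsumed(MediaSync sync, ByteBuffer audioBuffer, int bufferId) {
        // Runs on callbackThread; return the buffer to the audio decoder here.
        audioDecoder.releaseOutputBuffer(bufferId, false);
    }
}, callbackHandler);
```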
public void setOnErrorListener(MediaSync.OnErrorListener listener, Handler handler)
This method can be called multiple times to update a previously set listener. If the handler is changed, undelivered notifications scheduled for the old handler may be dropped.
Do not call this inside callback.

Parameters:
listener - The callback that will run. Use null to stop receiving callbacks.
handler - The Handler that will run the callback. Use null to use MediaSync's internal handler if it exists.
public void setSurface(Surface surface)
Currently, this is only supported in the Initialized state.

Parameters:
surface - Specify a surface on which to render the video data.
Throws:
IllegalArgumentException - if the surface has been released, is invalid, or can not be connected.
IllegalStateException - if setting the surface is not supported, e.g. not in the Initialized state, or another surface has already been set.
public void setAudioTrack(AudioTrack audioTrack)
Currently, this is only supported in the Initialized state.

Parameters:
audioTrack - Specify an AudioTrack through which to render the audio data.
Throws:
IllegalArgumentException - if the audioTrack has been released, or is invalid.
IllegalStateException - if setting the audio track is not supported, e.g. not in the Initialized state, or another audio track has already been set.
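As an illustrative sketch, creating a streaming AudioTrack and attaching it (the 44.1 kHz stereo PCM format here is an assumption, not a requirement):

```java
// Sketch: build an AudioTrack in streaming mode that matches the decoded audio
// format, then hand it to MediaSync while still in the Initialized state.
int sampleRate = 44100;  // assumed decoded sample rate
int minBufferSize = AudioTrack.getMinBufferSize(
        sampleRate, AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);

AudioTrack audioTrack = new AudioTrack.Builder()
        .setAudioAttributes(new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build())
        .setAudioFormat(new AudioFormat.Builder()
                .setSampleRate(sampleRate)
                .setChannelMask(AudioFormat.CHANNEL_OUT_STEREO)
                .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                .build())
        .setTransferMode(AudioTrack.MODE_STREAM)   // streaming mode is required
        .setBufferSizeInBytes(minBufferSize * 2)
        .build();

sync.setAudioTrack(audioTrack);
```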
public final Surface createInputSurface()
Requests a Surface to use as the input. This may only be called after setSurface(android.view.Surface).
The application is responsible for calling release() on the Surface when done.

Throws:
IllegalStateException - if not set, or another input surface has already been created.
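For illustration, a sketch of the required ordering, where format and surface are assumed to already exist:

```java
// Sketch: the output surface must be set before the input surface is requested.
sync.setSurface(surface);                        // e.g. from a SurfaceHolder
Surface inputSurface = sync.createInputSurface();

// The video decoder renders into MediaSync's input surface; MediaSync releases
// frames to the output surface at the right time. "format" is the video MediaFormat.
// (createDecoderByType throws IOException; handle or declare it in real code.)
MediaCodec videoDecoder =
        MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
videoDecoder.configure(format, inputSurface, null, 0);

// ...and when playback ends, the app releases the Surface it was given.
inputSurface.release();
```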
public void setPlaybackParams(PlaybackParams params)
Sets playback rate using PlaybackParams.
When using MediaSync with AudioTrack, set playback params using this call instead of calling it directly on the track, so that the sync is aware of the params change.
This call also works if there is no audio track.

Parameters:
params - the playback params to use. Speed is the ratio between desired playback rate and normal one. 1.0 means normal playback speed. 0.0 means pause. A value larger than 1.0 means faster playback, while a value between 0.0 and 1.0 means slower playback. Note: the normal rate does not change as a result of this call. To restore the original rate at any time, use speed of 1.0.
Throws:
IllegalStateException - if the internal sync engine or the audio track has not been initialized.
IllegalArgumentException - if the params are not supported.
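For instance, a sketch of switching to 1.5x speed while preserving pitch (the specific values are illustrative):

```java
// Sketch: change the rate on MediaSync, not on the AudioTrack, so the sync engine
// knows about it.
sync.setPlaybackParams(new PlaybackParams()
        .setSpeed(1.5f)     // 1.5x playback
        .setPitch(1.0f));   // keep the original pitch

// Pause, and later resume at the normal rate:
sync.setPlaybackParams(new PlaybackParams().setSpeed(0.f));
sync.setPlaybackParams(new PlaybackParams().setSpeed(1.f));
```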
public PlaybackParams getPlaybackParams()
Gets the playback rate using PlaybackParams.

Throws:
IllegalStateException - if the internal sync engine or the audio track has not been initialized.
public void setSyncParams(SyncParams params)

Parameters:
params - the A/V sync params to apply
Throws:
IllegalStateException - if the internal player engine has not been initialized.
IllegalArgumentException - if params are not supported.
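As a sketch, selecting the audio clock as the sync source and letting everything else fall back to defaults:

```java
// Sketch: drive A/V sync from the audio clock; allowDefaults() fills in any
// values that were not set explicitly.
sync.setSyncParams(new SyncParams()
        .setSyncSource(SyncParams.SYNC_SOURCE_AUDIO)
        .allowDefaults());

// The effective values can be read back afterwards.
SyncParams applied = sync.getSyncParams();
```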
public SyncParams getSyncParams()

Throws:
IllegalStateException - if the internal player engine has not been initialized.
public void flush()
All pending unprocessed audio and video buffers are discarded. If an audio track was configured, it is flushed and stopped. If a video output surface was configured, the last frame queued to it is left on the frame. Queue a blank video frame to clear the surface.
No callbacks are received for the flushed buffers.

Throws:
IllegalStateException - if the internal player engine has not been initialized.
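For illustration only, a sketch of flush() used as part of a seek; the extractor/decoder handling is app code and the names are assumptions:

```java
// Sketch of a seek: pause, drop queued buffers, reposition the source, resume.
// "extractor", the decoders and seekPositionUs are assumed app-side state.
sync.setPlaybackParams(new PlaybackParams().setSpeed(0.f)); // pause
sync.flush();                  // discard queued audio/video; no callbacks for them

videoDecoder.flush();          // also drop buffers still held by the codecs
audioDecoder.flush();          // (codecs in asynchronous mode need start() after flush)
extractor.seekTo(seekPositionUs, MediaExtractor.SEEK_TO_CLOSEST_SYNC);

// Re-queue data from the new position, then resume playback.
sync.setPlaybackParams(new PlaybackParams().setSpeed(1.f));
```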
public MediaTimestamp getTimestamp()
The MediaTimestamp represents how the media time correlates to the system time in a linear fashion using an anchor and a clock rate. During regular playback, the media time moves fairly constantly (though the anchor frame may be rebased to a current system time, the linear correlation stays steady). Therefore, this method does not need to be called often.
To help users get current playback position, this method always anchors the timestamp to the current system time, so MediaTimestamp.getAnchorMediaTimeUs() can be used as current playback position.

Returns:
null if no timestamp is available, e.g. because the media player has not been initialized.
See Also: MediaTimestamp
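A short sketch of deriving the current position from the returned timestamp, following the anchoring behavior described above:

```java
// Sketch: because the timestamp is anchored to the current system time, the anchor
// media time is effectively the current playback position.
MediaTimestamp ts = sync.getTimestamp();
if (ts != null) {
    long positionUs = ts.getAnchorMediaTimeUs();   // current position in microseconds
    float clockRate = ts.getMediaClockRate();      // current playback rate
    // e.g. update a progress indicator with positionUs
}
```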
public void queueAudio(ByteBuffer audioData, int bufferId, long presentationTimeUs)
If the audio track was flushed as a result of flush(), it will be restarted.

Parameters:
audioData - the buffer that holds the data to play. This buffer will be returned to the client via registered callback.
bufferId - an integer used to identify audioData. It will be returned to the client along with audioData. This helps applications to keep track of audioData, e.g., it can be used to store the output buffer index used by the audio codec.
presentationTimeUs - the presentation timestamp in microseconds for the first frame in the buffer.
Throws:
IllegalStateException - if the audio track is not set or internal configuration has not been done correctly.