Unable to mux both audio and video

Alexey

I'm writing an app that records screen capture and audio using MediaCodec. I use MediaMuxer to mux the video and audio into an mp4 file. I successfully managed to write video and audio separately; however, when I try muxing them together live, the result is unexpected: either the audio plays without video, or the video plays right after the audio. My guess is that I'm doing something wrong with the timestamps, but I can't figure out what exactly. I already looked at these examples: https://github.com/OnlyInAmerica/HWEncoderExperiments/tree/audiotest/HWEncoderExperiments/src/main/java/net/openwatch/hwencoderexperiments and the ones on bigflake.com, and was not able to find the answer.

Here are my media format configurations:

    mVideoFormat = createVideoFormat();

    private static MediaFormat createVideoFormat() {
        MediaFormat format = MediaFormat.createVideoFormat(
                Preferences.MIME_TYPE, mScreenWidth, mScreenHeight);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, Preferences.BIT_RATE);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, Preferences.FRAME_RATE);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL,
                Preferences.IFRAME_INTERVAL);
        return format;
    }
    mAudioFormat = createAudioFormat();

    private static MediaFormat createAudioFormat() {
        MediaFormat format = new MediaFormat();
        format.setString(MediaFormat.KEY_MIME, "audio/mp4a-latm");
        format.setInteger(MediaFormat.KEY_AAC_PROFILE,
                MediaCodecInfo.CodecProfileLevel.AACObjectLC);
        format.setInteger(MediaFormat.KEY_SAMPLE_RATE, 44100);
        format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 1);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 64000);
        return format;
    }

Audio and video encoder and muxer setup:

    mVideoEncoder = MediaCodec.createEncoderByType(Preferences.MIME_TYPE);
    mVideoEncoder.configure(mVideoFormat, null, null,
            MediaCodec.CONFIGURE_FLAG_ENCODE);
    mInputSurface = new InputSurface(mVideoEncoder.createInputSurface(),
            mSavedEglContext);
    mVideoEncoder.start();

    if (recordAudio) {
        audioBufferSize = AudioRecord.getMinBufferSize(44100,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
        mAudioRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC, 44100,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT, audioBufferSize);
        mAudioRecorder.startRecording();

        mAudioEncoder = MediaCodec.createEncoderByType("audio/mp4a-latm");
        mAudioEncoder.configure(mAudioFormat, null, null,
                MediaCodec.CONFIGURE_FLAG_ENCODE);
        mAudioEncoder.start();
    }

    try {
        String fileId = String.valueOf(System.currentTimeMillis());
        mMuxer = new MediaMuxer(dir.getPath() + "/Video" + fileId + ".mp4",
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    } catch (IOException ioe) {
        throw new RuntimeException("MediaMuxer creation failed", ioe);
    }

    mVideoTrackIndex = -1;
    mAudioTrackIndex = -1;
    mMuxerStarted = false;

I use this to set up video timestamps:

    mInputSurface.setPresentationTime(mSurfaceTexture.getTimestamp());
    drainVideoEncoder(false);

And this to set up audio timestamps:

    lastQueuedPresentationTimeStampUs = getNextQueuedPresentationTimeStampUs();

    if (endOfStream)
        mAudioEncoder.queueInputBuffer(inputBufferIndex, 0, audioBuffer.length,
                lastQueuedPresentationTimeStampUs,
                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
    else
        mAudioEncoder.queueInputBuffer(inputBufferIndex, 0, audioBuffer.length,
                lastQueuedPresentationTimeStampUs, 0);

    mAudioBufferInfo.presentationTimeUs = getNextDeQueuedPresentationTimeStampUs();
    mMuxer.writeSampleData(mAudioTrackIndex, encodedData, mAudioBufferInfo);
    lastDequeuedPresentationTimeStampUs = mAudioBufferInfo.presentationTimeUs;


    private static long getNextQueuedPresentationTimeStampUs() {
        long nextQueuedPresentationTimeStampUs =
                (lastQueuedPresentationTimeStampUs > lastDequeuedPresentationTimeStampUs)
                        ? (lastQueuedPresentationTimeStampUs + 1)
                        : (lastDequeuedPresentationTimeStampUs + 1);
        Log.i(TAG, "nextQueuedPresentationTimeStampUs: " + nextQueuedPresentationTimeStampUs);
        return nextQueuedPresentationTimeStampUs;
    }

    private static long getNextDeQueuedPresentationTimeStampUs() {
        Log.i(TAG, "nextDequeuedPresentationTimeStampUs: "
                + (lastDequeuedPresentationTimeStampUs + 1));
        lastDequeuedPresentationTimeStampUs++;
        return lastDequeuedPresentationTimeStampUs;
    }

I took this approach from this example https://github.com/OnlyInAmerica/HWEncoderExperiments/blob/audiotest/HWEncoderExperiments/src/main/java/net/openwatch/hwencoderexperiments/AudioEncodingTest.java in order to avoid the "timestampUs XXX < lastTimestampUs XXX" error.

Can someone help me figure out the problem, please?

fadden

It looks like you're using system-provided timestamps for video but a simple counter for audio, unless the video timestamp is somehow being used to seed the audio every frame and that's just not shown above.

For audio and video to play in sync, you need to have the same presentation time stamp on audio and video frames that are expected to be presented at the same time.
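As a rough sketch (not your code, and assuming the SurfaceTexture timestamps are on the CLOCK_MONOTONIC timeline that System.nanoTime() also reads, which is the usual case), you could derive the audio presentation time from a start time captured on that shared clock plus the number of PCM samples recorded so far, and then pass the BufferInfo.presentationTimeUs that the audio encoder reports straight to writeSampleData() instead of overwriting it with a counter:

    // Illustrative names only (SAMPLE_RATE, mAudioStartTimeNs, mTotalSamplesRead).
    // Goal: stamp audio input buffers in microseconds on the same monotonic
    // timeline the video timestamps already use.
    private static final int SAMPLE_RATE = 44100;   // must match the AudioFormat
    private long mAudioStartTimeNs = -1;            // anchor on the shared clock
    private long mTotalSamplesRead = 0;             // 16-bit mono PCM samples so far

    private long getAudioPresentationTimeUs(int samplesJustRead) {
        if (mAudioStartTimeNs < 0) {
            mAudioStartTimeNs = System.nanoTime();  // same clock as the video side
        }
        long ptsUs = mAudioStartTimeNs / 1000L
                + (mTotalSamplesRead * 1000000L) / SAMPLE_RATE;
        mTotalSamplesRead += samplesJustRead;
        return ptsUs;
    }

    // When queuing PCM into the audio encoder (bytesRead from AudioRecord.read()):
    int samplesRead = bytesRead / 2;                // 2 bytes per 16-bit mono sample
    long ptsUs = getAudioPresentationTimeUs(samplesRead);
    mAudioEncoder.queueInputBuffer(inputBufferIndex, 0, bytesRead, ptsUs,
            endOfStream ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);

The video side can stay as it is (setPresentationTime() with the SurfaceTexture timestamp, in nanoseconds); both encoders then report presentation times on the same timeline in their BufferInfo, and you can hand those to writeSampleData() unmodified.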

See also this related question.
