A Detailed Look at Several Approaches to Muxing Audio and Video on Android

Preface

I recently ran into an audio/video processing requirement at work: merging audio and video on Android. The approaches I surveyed fall into three main categories: the platform's MediaMuxer API (hardware-backed), mp4parser, and FFmpeg. All three can do the job, but each has its own limitations and problems, so I am recording the implementations and the issues here for later reference and study. Without further ado, here is the detailed walkthrough.
Method One (Fail)

Use MediaMuxer to mux the audio and video.

Result: the audio and video do get merged, playback is fine in Android's native VideoView and SurfaceView, and most players handle the file correctly. However — and this is the catch — uploading to YouTube breaks it: the audio becomes discontinuous. The likely reason is that YouTube re-compresses the upload, and the audio timing falls apart during that re-compression.

Analysis: the presentationTimeUs timestamps written into MediaCodec.BufferInfo are wrong, so YouTube's re-compression ends up scrambling the audio.
public static void muxVideoAndAudio(String videoPath, String audioPath, String muxPath) {
    try {
        MediaExtractor videoExtractor = new MediaExtractor();
        videoExtractor.setDataSource(videoPath);
        MediaFormat videoFormat = null;
        int videoTrackIndex = -1;
        int videoTrackCount = videoExtractor.getTrackCount();
        for (int i = 0; i < videoTrackCount; i++) {
            videoFormat = videoExtractor.getTrackFormat(i);
            String mimeType = videoFormat.getString(MediaFormat.KEY_MIME);
            if (mimeType.startsWith("video/")) {
                videoTrackIndex = i;
                break;
            }
        }

        MediaExtractor audioExtractor = new MediaExtractor();
        audioExtractor.setDataSource(audioPath);
        MediaFormat audioFormat = null;
        int audioTrackIndex = -1;
        int audioTrackCount = audioExtractor.getTrackCount();
        for (int i = 0; i < audioTrackCount; i++) {
            audioFormat = audioExtractor.getTrackFormat(i);
            String mimeType = audioFormat.getString(MediaFormat.KEY_MIME);
            if (mimeType.startsWith("audio/")) {
                audioTrackIndex = i;
                break;
            }
        }

        videoExtractor.selectTrack(videoTrackIndex);
        audioExtractor.selectTrack(audioTrackIndex);

        MediaCodec.BufferInfo videoBufferInfo = new MediaCodec.BufferInfo();
        MediaCodec.BufferInfo audioBufferInfo = new MediaCodec.BufferInfo();

        MediaMuxer mediaMuxer = new MediaMuxer(muxPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        int writeVideoTrackIndex = mediaMuxer.addTrack(videoFormat);
        int writeAudioTrackIndex = mediaMuxer.addTrack(audioFormat);
        mediaMuxer.start();

        ByteBuffer byteBuffer = ByteBuffer.allocate(500 * 1024);
        long sampleTime = 0;
        {
            // Estimate a fixed frame interval from the gap between the 2nd and 3rd video samples.
            videoExtractor.readSampleData(byteBuffer, 0);
            if (videoExtractor.getSampleFlags() == MediaExtractor.SAMPLE_FLAG_SYNC) {
                videoExtractor.advance();
            }
            videoExtractor.readSampleData(byteBuffer, 0);
            long secondTime = videoExtractor.getSampleTime();
            videoExtractor.advance();
            long thirdTime = videoExtractor.getSampleTime();
            sampleTime = Math.abs(thirdTime - secondTime);
        }
        // Rewind the video track to the beginning.
        videoExtractor.unselectTrack(videoTrackIndex);
        videoExtractor.selectTrack(videoTrackIndex);

        // Both loops advance presentationTimeUs by the same fixed interval,
        // which is the likely cause of the audio drift described above.
        while (true) {
            int readVideoSampleSize = videoExtractor.readSampleData(byteBuffer, 0);
            if (readVideoSampleSize < 0) {
                break;
            }
            videoBufferInfo.size = readVideoSampleSize;
            videoBufferInfo.presentationTimeUs += sampleTime;
            videoBufferInfo.offset = 0;
            //noinspection WrongConstant
            videoBufferInfo.flags = MediaCodec.BUFFER_FLAG_SYNC_FRAME; // videoExtractor.getSampleFlags()
            mediaMuxer.writeSampleData(writeVideoTrackIndex, byteBuffer, videoBufferInfo);
            videoExtractor.advance();
        }

        while (true) {
            int readAudioSampleSize = audioExtractor.readSampleData(byteBuffer, 0);
            if (readAudioSampleSize < 0) {
                break;
            }
            audioBufferInfo.size = readAudioSampleSize;
            audioBufferInfo.presentationTimeUs += sampleTime;
            audioBufferInfo.offset = 0;
            //noinspection WrongConstant
            audioBufferInfo.flags = MediaCodec.BUFFER_FLAG_SYNC_FRAME; // videoExtractor.getSampleFlags()
            mediaMuxer.writeSampleData(writeAudioTrackIndex, byteBuffer, audioBufferInfo);
            audioExtractor.advance();
        }

        mediaMuxer.stop();
        mediaMuxer.release();
        videoExtractor.release();
        audioExtractor.release();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
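The likely fix is to stop accumulating a fixed interval and instead copy each sample's timestamp and flags straight from the extractor, which is exactly what Method Two does below. A minimal sketch of a corrected video write loop, reusing the variable names from the code above:

while (true) {
    int size = videoExtractor.readSampleData(byteBuffer, 0);
    if (size < 0) {
        break;
    }
    videoBufferInfo.offset = 0;
    videoBufferInfo.size = size;
    // Take the real timestamp and flags of this sample from the extractor.
    videoBufferInfo.presentationTimeUs = videoExtractor.getSampleTime();
    //noinspection WrongConstant
    videoBufferInfo.flags = videoExtractor.getSampleFlags();
    mediaMuxer.writeSampleData(writeVideoTrackIndex, byteBuffer, videoBufferInfo);
    videoExtractor.advance();
}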
Method Two (Success)

Also built on MediaExtractor and MediaMuxer, but here each sample's presentationTimeUs and flags are read directly from the extractor, which keeps the timestamps of both tracks consistent.
public static void muxVideoAudio(String videoFilePath, String audioFilePath, String outputFile) {
    try {
        MediaExtractor videoExtractor = new MediaExtractor();
        videoExtractor.setDataSource(videoFilePath);
        MediaExtractor audioExtractor = new MediaExtractor();
        audioExtractor.setDataSource(audioFilePath);
        MediaMuxer muxer = new MediaMuxer(outputFile, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

        videoExtractor.selectTrack(0);
        MediaFormat videoFormat = videoExtractor.getTrackFormat(0);
        int videoTrack = muxer.addTrack(videoFormat);

        audioExtractor.selectTrack(0);
        MediaFormat audioFormat = audioExtractor.getTrackFormat(0);
        int audioTrack = muxer.addTrack(audioFormat);

        LogUtil.d(TAG, "Video Format " + videoFormat.toString());
        LogUtil.d(TAG, "Audio Format " + audioFormat.toString());

        boolean sawEOS = false;
        int frameCount = 0;
        int offset = 100;
        int sampleSize = 256 * 1024;
        ByteBuffer videoBuf = ByteBuffer.allocate(sampleSize);
        ByteBuffer audioBuf = ByteBuffer.allocate(sampleSize);
        MediaCodec.BufferInfo videoBufferInfo = new MediaCodec.BufferInfo();
        MediaCodec.BufferInfo audioBufferInfo = new MediaCodec.BufferInfo();

        videoExtractor.seekTo(0, MediaExtractor.SEEK_TO_CLOSEST_SYNC);
        audioExtractor.seekTo(0, MediaExtractor.SEEK_TO_CLOSEST_SYNC);

        muxer.start();

        while (!sawEOS) {
            videoBufferInfo.offset = offset;
            videoBufferInfo.size = videoExtractor.readSampleData(videoBuf, offset);
            if (videoBufferInfo.size < 0 || audioBufferInfo.size < 0) {
                sawEOS = true;
                videoBufferInfo.size = 0;
            } else {
                videoBufferInfo.presentationTimeUs = videoExtractor.getSampleTime();
                //noinspection WrongConstant
                videoBufferInfo.flags = videoExtractor.getSampleFlags();
                muxer.writeSampleData(videoTrack, videoBuf, videoBufferInfo);
                videoExtractor.advance();
                frameCount++;
            }
        }

        boolean sawEOS2 = false;
        int frameCount2 = 0;
        while (!sawEOS2) {
            frameCount2++;
            audioBufferInfo.offset = offset;
            audioBufferInfo.size = audioExtractor.readSampleData(audioBuf, offset);
            if (videoBufferInfo.size < 0 || audioBufferInfo.size < 0) {
                sawEOS2 = true;
                audioBufferInfo.size = 0;
            } else {
                audioBufferInfo.presentationTimeUs = audioExtractor.getSampleTime();
                //noinspection WrongConstant
                audioBufferInfo.flags = audioExtractor.getSampleFlags();
                muxer.writeSampleData(audioTrack, audioBuf, audioBufferInfo);
                audioExtractor.advance();
            }
        }

        muxer.stop();
        muxer.release();
        LogUtil.d(TAG, "Output: " + outputFile);
    } catch (IOException e) {
        LogUtil.d(TAG, "Mixer Error 1 " + e.getMessage());
    } catch (Exception e) {
        LogUtil.d(TAG, "Mixer Error 2 " + e.getMessage());
    }
}
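Muxing a long file can take a while, so the method should not be called on the UI thread. A hypothetical call site (the file paths are placeholders, not from the original project):

new Thread(new Runnable() {
    @Override
    public void run() {
        // Placeholder paths; the input files must already exist on the device.
        muxVideoAudio("/sdcard/Movies/input_video.mp4",
                "/sdcard/Music/input_audio.aac",
                "/sdcard/Movies/muxed_output.mp4");
    }
}).start();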
Method Three

Implemented with mp4parser.

mp4parser is an open-source toolkit for working with MP4 files. Its classes depend on the rest of the toolkit, so the library has to be added to your own project (as a jar, or simply via the Gradle dependency below) before its methods can be called.
compile "com.googlecode.mp4parser:isoparser:1.1.21"
Problem: after YouTube re-compresses the upload, most of the video data is lost; usually only about one second of footage survives, which effectively turns the video into a still image. Awkward.
public boolean mux(String videoFile, String audioFile, final String outputFile) {
    if (isStopMux) {
        return false;
    }
    Movie video;
    try {
        video = MovieCreator.build(videoFile);
    } catch (RuntimeException e) {
        e.printStackTrace();
        return false;
    } catch (IOException e) {
        e.printStackTrace();
        return false;
    }
    Movie audio;
    try {
        audio = MovieCreator.build(audioFile);
    } catch (IOException e) {
        e.printStackTrace();
        return false;
    } catch (NullPointerException e) {
        e.printStackTrace();
        return false;
    }
    Track audioTrack = audio.getTracks().get(0);
    video.addTrack(audioTrack);
    Container out = new DefaultMp4Builder().build(video);
    FileOutputStream fos;
    try {
        fos = new FileOutputStream(outputFile);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
        return false;
    }
    BufferedWritableFileByteChannel byteBufferByteChannel = new BufferedWritableFileByteChannel(fos);
    try {
        out.writeContainer(byteBufferByteChannel);
        byteBufferByteChannel.close();
        fos.close();
        if (isStopMux) {
            return false;
        }
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                mCustomeProgressDialog.setProgress(100);
                goShareActivity(outputFile);
                // FileUtils.insertMediaDB(AddAudiosActivity.this, outputFile);
            }
        });
    } catch (IOException e) {
        e.printStackTrace();
        if (mCustomeProgressDialog.isShowing()) {
            mCustomeProgressDialog.dismiss();
        }
        ToastUtil.showShort(getString(R.string.process_failed));
        return false;
    }
    return true;
}

private static class BufferedWritableFileByteChannel implements WritableByteChannel {
    private static final int BUFFER_CAPACITY = 2000000;

    private boolean isOpen = true;
    private final OutputStream outputStream;
    private final ByteBuffer byteBuffer;
    private final byte[] rawBuffer = new byte[BUFFER_CAPACITY];

    private BufferedWritableFileByteChannel(OutputStream outputStream) {
        this.outputStream = outputStream;
        this.byteBuffer = ByteBuffer.wrap(rawBuffer);
    }

    @Override
    public int write(ByteBuffer inputBuffer) throws IOException {
        int inputBytes = inputBuffer.remaining();
        if (inputBytes > byteBuffer.remaining()) {
            dumpToFile();
            byteBuffer.clear();
            if (inputBytes > byteBuffer.remaining()) {
                throw new BufferOverflowException();
            }
        }
        byteBuffer.put(inputBuffer);
        return inputBytes;
    }

    @Override
    public boolean isOpen() {
        return isOpen;
    }

    @Override
    public void close() throws IOException {
        dumpToFile();
        isOpen = false;
    }

    private void dumpToFile() {
        try {
            outputStream.write(rawBuffer, 0, byteBuffer.position());
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
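Stripped of the UI handling above, the core mp4parser merge is only a few lines. A minimal sketch, assuming the same isoparser 1.1.21 dependency and placeholder file paths:

// Minimal mp4parser merge sketch (placeholder paths, IOException handling omitted):
// parse both files, append the first audio track to the video, write the result.
Movie video = MovieCreator.build("/sdcard/Movies/video.mp4");
Movie audio = MovieCreator.build("/sdcard/Music/audio.m4a");
video.addTrack(audio.getTracks().get(0));
Container out = new DefaultMp4Builder().build(video);
FileChannel channel = new FileOutputStream("/sdcard/Movies/merged.mp4").getChannel();
out.writeContainer(channel);
channel.close();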
Method Four

Use FFmpeg.

FFmpeg has a rich set of codec plugins and detailed documentation, and debugging a single ffmpeg command line is far simpler than debugging a large amount of hand-written encode/decode code (yes, doing this with MediaCodec is extremely verbose and fiddly).
Merge Video/Audio and retain both audios

This works and compatibility is excellent, but because everything goes through software codecs the merge is unbearably slow, and I do not yet know enough about the corresponding FFmpeg optimizations. Sigh...
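The article does not include the actual command, but a commonly used ffmpeg invocation for merging a video with an extra audio track while keeping the video's own audio looks roughly like this (file names are placeholders):

ffmpeg -i video.mp4 -i extra_audio.mp3 \
       -filter_complex "[0:a][1:a]amix=inputs=2:duration=first[aout]" \
       -map 0:v -map "[aout]" -c:v copy -c:a aac output.mp4

Here amix mixes the two audio streams, -map 0:v keeps the original video stream, and -c:v copy avoids re-encoding the video, so only the mixed audio is re-encoded.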
Summary

That is all for this article. I hope the content is of some reference value for your study or work. If you have any questions, feel free to leave a comment, and thank you for supporting 腳本之家.