An effective way to capture a real-time audio stream while playing video on an Android device

{"type":"doc","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"這篇文章將會按照一般的需求開發流程,從需求、分析、開發,到總結,來給大家講解","attrs":{}},{"type":"text","marks":[{"type":"strong","attrs":{}}],"text":"一種“在Android 設備上,播放視頻的同時,獲取實時音頻流”的有效方案","attrs":{}},{"type":"text","text":"。","attrs":{}}]},{"type":"heading","attrs":{"align":null,"level":2},"content":[{"type":"text","text":"一、需求","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"在車載產品上,有這樣一種需求,比如我把我的Android設備通過usb線連接上車機,這時我希望我在我Android手機上的操作,能同步到車機大屏上進行顯示。","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"現在很多車機基本都是Android系統了,市場上也有類似","attrs":{}},{"type":"text","marks":[{"type":"strong","attrs":{}}],"text":"CarPlay","attrs":{}},{"type":"text","text":"、","attrs":{}},{"type":"text","marks":[{"type":"strong","attrs":{}}],"text":"CarLife","attrs":{}},{"type":"text","text":"這種專門做手機投屏的軟件了。不過呢,還有一部分的車子,他們的車機用的是Linux系統,這時如何實現Android設備和linux設備之間的屏幕信息同步呢?","attrs":{}}]},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/74/7450626a66624d084a782d794eb23dd9.png","alt":null,"title":"百度Carlife、蘋果Carplay","style":[{"key":"width","value":"75%"},{"key":"bordertype","value":"boxShadow"}],"href":"","fromPaste":false,"pastePass":false}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"接下來的文章,我們只介紹其中的一種場景,就是我手機播放視頻的時候,視頻內容和視頻的聲音,都同步到linux系統的車機上。而且這篇文章,我們只介紹音頻同步的部分。","attrs":{}}]},{"type":"heading","attrs":{"align":null,"level":2},"content":[{"type":"text","text":"二、分析","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"兩個設備之間的音頻同步,那就是把一個設備中的音頻數據同步到另一個設備上,一方作爲發送端,另一方作爲接收端,發送端不停的發生音頻流,接收端接收到音頻流後,進行實時的播放,即可實現我們想要的效果。","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"說到設備之間的通信,相信很多同學會想到","attrs":{}},{"type":"text","marks":[{"type":"strong","attrs":{}}],"text":"tcp","attrs":{}},{"type":"text","text":"、udp這些協議了。是的,考慮到tcp協議傳輸的有序性,而udp是無序的,我們傳輸的音頻數據也是需要有序的,所有音頻數據的傳輸,我們採用tcp協議。","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"接下來我們再瞭解下,在Android系統上,聲音的播放流程是怎樣的呢?這對我們如何去獲取視頻播放時候的音頻流,很有幫助。","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"我們先看下關於視頻的播放、錄音,Android都給我們提供了哪些API?","attrs":{}}]},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"MediaRecorder","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"接觸過Android錄像、錄音的同學,應該對MediaRecorder 這個API不會感到模式。是的,在Android系統上,我們可以通過MediaRecorder API來很容易的實現錄像、錄音功能,下面是關於MediaRecorder 狀態圖,具體接口的使用,感興趣的可以查看Android 
官方文檔(","attrs":{}},{"type":"link","attrs":{"href":"https://developer.android.google.cn/guide/topics/media/mediarecorder?hl=zh_cn","title":null,"type":null},"content":[{"type":"text","text":"https://developer.android.google.cn/guide/topics/media/mediarecorder?hl=zh_cn","attrs":{}}]},{"type":"text","text":")。","attrs":{}}]},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/68/681e5c43050a99549768e4dd26644f62.png","alt":null,"title":"Android MediaRecorder接口","style":[{"key":"width","value":"75%"},{"key":"bordertype","value":"boxShadow"}],"href":"","fromPaste":false,"pastePass":false}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"MediaPlayer","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"另外,用於播放視頻的,Android爲我們提供了MediaPlayer的接口(","attrs":{}},{"type":"link","attrs":{"href":"https://developer.android.google.cn/guide/topics/media/mediaplayer?hl=en","title":null,"type":null},"content":[{"type":"text","text":"https://developer.android.google.cn/guide/topics/media/mediaplayer?hl=en","attrs":{}}]},{"type":"text","text":")。","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"瞭解了上面的2個API,我們再來看下Android音頻系統的框架圖。","attrs":{}}]},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/81/81de5a381ca5e02a1c22f42b39cd6708.png","alt":null,"title":"Android音頻系統框架","style":[{"key":"width","value":"75%"},{"key":"bordertype","value":"boxShadow"}],"href":"","fromPaste":false,"pastePass":false}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"從上面的音頻系統框架圖(看畫紅線的部分),我們可以知道,應用上調用MediaPlayer、MediaRecorder來播放、錄音,在framewrok層都會調用到AudioTrack.cpp這個文件。","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"那麼回到我們這篇文章的重點,我們需要在播放視頻的時候,把視頻的音頻流實時的截取出來。那截取音頻流的這部分工作,就可以放在","attrs":{}},{"type":"text","marks":[{"type":"strong","attrs":{}}],"text":"AudioTrack.cpp","attrs":{}},{"type":"text","text":"中進行處理。","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"我們來看下AudioTrack.cpp裏面比較重要的方法:","attrs":{}}]},{"type":"codeblock","attrs":{"lang":null},"content":[{"type":"text","text":"// 播放視頻時,播放的音頻流會調用到AudioTrack.cpp的write方法\nssize_t AudioTrack::write(const void* buffer, size_t userSize, bool blocking)\n{\n if (mTransfer != TRANSFER_SYNC) {\n return INVALID_OPERATION;\n }\n\n if (isDirect()) {\n AutoMutex lock(mLock);\n int32_t flags = android_atomic_and(\n ~(CBLK_UNDERRUN | CBLK_LOOP_CYCLE | CBLK_LOOP_FINAL | CBLK_BUFFER_END),\n &mCblk->mFlags);\n if (flags & CBLK_INVALID) {\n return DEAD_OBJECT;\n }\n }\n\n if (ssize_t(userSize) < 0 || (buffer == NULL && userSize != 0)) {\n // Sanity-check: user is most-likely passing an error code, and it would\n // make the return value ambiguous (actualSize vs error).\n ALOGE(\"AudioTrack::write(buffer=%p, size=%zu (%zd)\", buffer, userSize, userSize);\n return BAD_VALUE;\n }\n\n 
With those two APIs in mind, let's look at the Android audio system architecture.

![Android audio system architecture](https://static001.geekbang.org/infoq/81/81de5a381ca5e02a1c22f42b39cd6708.png)

From the architecture diagram (follow the parts marked with red lines), we can see that when an application plays audio through MediaPlayer, the framework layer ends up calling into AudioTrack.cpp (recording through MediaRecorder takes the corresponding capture path through AudioRecord).

Back to the point of this article: while a video is playing, we want to tap its audio stream in real time. That interception can therefore be done inside **AudioTrack.cpp**.

Let's look at the key method in AudioTrack.cpp:

```cpp
// While a video plays, the rendered audio stream goes through AudioTrack::write().
ssize_t AudioTrack::write(const void* buffer, size_t userSize, bool blocking)
{
    if (mTransfer != TRANSFER_SYNC) {
        return INVALID_OPERATION;
    }

    if (isDirect()) {
        AutoMutex lock(mLock);
        int32_t flags = android_atomic_and(
                ~(CBLK_UNDERRUN | CBLK_LOOP_CYCLE | CBLK_LOOP_FINAL | CBLK_BUFFER_END),
                &mCblk->mFlags);
        if (flags & CBLK_INVALID) {
            return DEAD_OBJECT;
        }
    }

    if (ssize_t(userSize) < 0 || (buffer == NULL && userSize != 0)) {
        // Sanity-check: user is most-likely passing an error code, and it would
        // make the return value ambiguous (actualSize vs error).
        ALOGE("AudioTrack::write(buffer=%p, size=%zu (%zd)", buffer, userSize, userSize);
        return BAD_VALUE;
    }

    size_t written = 0;
    Buffer audioBuffer;

    while (userSize >= mFrameSize) {
        audioBuffer.frameCount = userSize / mFrameSize;

        status_t err = obtainBuffer(&audioBuffer,
                blocking ? &ClientProxy::kForever : &ClientProxy::kNonBlocking);
        if (err < 0) {
            if (written > 0) {
                break;
            }
            if (err == TIMED_OUT || err == -EINTR) {
                err = WOULD_BLOCK;
            }
            return ssize_t(err);
        }

        size_t toWrite = audioBuffer.size;
        memcpy(audioBuffer.i8, buffer, toWrite);

        // Added for this article: keep a copy of the chunk so it can be forwarded.
        mBuffer = malloc(toWrite);
        memcpy(mBuffer, buffer, toWrite);

        buffer = ((const char *) buffer) + toWrite;
        userSize -= toWrite;
        written += toWrite;

        releaseBuffer(&audioBuffer);
    }

    if (written > 0) {
        mFramesWritten += written / mFrameSize;
    }
    return written;
}
```

## 3. Implementation

After all that analysis the plan is fairly clear: in AudioTrack.cpp at the framework layer, send the audio stream out in real time over a socket. On the other side, the receiver keeps reading from that socket; the data it gets is a live PCM stream, and by playing it as it arrives we get real-time audio synchronization.

As for how the video stream is synchronized, feel free to take a guess.

### 1) Code in AudioTrack.cpp

```cpp
#define DEST_PORT 5046
#define DEST_IP_ADDRESS "192.168.7.6"

int mSocket;
bool mSocketHasInit;
bool mCurrentPlayMusicStream;
struct sockaddr_in mRemoteAddr;

ssize_t AudioTrack::write(const void* buffer, size_t userSize, bool blocking)
{
    ......
    size_t toWrite = audioBuffer.size;
    memcpy(audioBuffer.i8, buffer, toWrite);

    mBuffer = malloc(toWrite);
    memcpy(mBuffer, buffer, toWrite);
    // Our addition: push the audio stream out in real time.
    if (mCurrentPlayMusicStream && mSocketHasInit) {
        onSocketSendData(toWrite);
    }
    ......
}

int AudioTrack::onSocketSendData(uint32_t len) {
    assert(NULL != mBuffer);
    assert(-1 != len);

    if (!mSocketHasInit) {
        initTcpSocket();
    }

    unsigned int ret = send(mSocket, mBuffer, len, 0);
    free(mBuffer);
    return 0;
}
```

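The code above relies on an initTcpSocket() helper that the article does not show. Assuming it simply fills in mRemoteAddr from DEST_IP_ADDRESS / DEST_PORT and opens a blocking TCP connection, a minimal sketch could look like this (the helper and the new members would also need to be declared in AudioTrack.h; this is a guess at its shape, not the author's actual code):

```cpp
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <unistd.h>
#include <string.h>

// Sketch of a possible initTcpSocket(): connect a blocking TCP socket to the receiver.
void AudioTrack::initTcpSocket()
{
    mSocket = socket(AF_INET, SOCK_STREAM, 0);
    if (mSocket < 0) {
        ALOGE("initTcpSocket: socket() failed");
        return;
    }

    memset(&mRemoteAddr, 0, sizeof(mRemoteAddr));
    mRemoteAddr.sin_family = AF_INET;
    mRemoteAddr.sin_port = htons(DEST_PORT);
    mRemoteAddr.sin_addr.s_addr = inet_addr(DEST_IP_ADDRESS);

    if (connect(mSocket, (struct sockaddr *) &mRemoteAddr, sizeof(mRemoteAddr)) < 0) {
        ALOGE("initTcpSocket: connect() to %s:%d failed", DEST_IP_ADDRESS, DEST_PORT);
        close(mSocket);
        mSocket = -1;
        return;
    }

    mSocketHasInit = true;   // from now on write() will forward each PCM chunk
}
```

Note that, as shown, write() only forwards data when mSocketHasInit is already true, so onSocketSendData() never reaches its own initTcpSocket() call; in practice the socket has to be initialized somewhere earlier, for example when the track is set up or started.
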
### 2) Code on the receiver side

(I used an Android device for debugging here; on a Linux system the approach is the same.)

The receiver's processing flow is as follows:

1. Set up the socket listener.
2. Loop, listening for data on the socket port.
3. Receive the PCM stream.
4. Play the PCM stream.

![Receiver-side processing flow](https://static001.geekbang.org/infoq/7f/7fcbd3f40c31cacf67766028f1b2af27.png)

```java
// ----------- PlayActivity.java -----------

    private ServerSocket mTcpServerSocket = null;
    private List<Socket> mSocketList = new ArrayList<>();
    private MyTcpListener mTcpListener = null;

    private boolean isAccept = true;

    /**
     * Set up the socket listener.
     */
    public void startTcpService() {
        Log.v(TAG, "startTcpService();");
        if (mTcpListener == null) {
            mTcpListener = new MyTcpListener();
        }

        new Thread() {
            @Override
            public void run() {
                super.run();
                try {
                    mTcpServerSocket = new ServerSocket();
                    mTcpServerSocket.setReuseAddress(true);
                    InetSocketAddress socketAddress = new InetSocketAddress(AndroidBoxProtocol.TCP_AUDIO_STREAM_PORT);
                    mTcpServerSocket.bind(socketAddress);

                    while (isAccept) {
                        Socket socket = mTcpServerSocket.accept();
                        mSocketList.add(socket);

                        // Start a new thread to receive data from this socket.
                        new Thread(new TcpServerThread(socket, mTcpListener)).start();
                    }
                } catch (Exception e) {
                    Log.e("TcpServer", "" + e.toString());
                }
            }
        }.start();
    }

    /**
     * Stop the socket listener.
     */
    private void stopTcpService() {
        isAccept = false;
        if (mTcpServerSocket != null) {
            new Thread() {
                @Override
                public void run() {
                    super.run();
                    try {
                        for (Socket socket : mSocketList) {
                            socket.close();
                        }
                        mTcpServerSocket.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }.start();
        }
    }

    /**
     * Play the live PCM stream.
     * @param buffer one chunk of received PCM data
     */
    private void playPcmStream(byte[] buffer) {
        if (mAudioTrack != null && buffer != null) {
            mAudioTrack.play();
            mAudioTrack.write(buffer, 0, buffer.length);
        }
    }

    private Handler mUiHandler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            super.handleMessage(msg);
            switch (msg.what) {
                case HANDLER_MSG_PLAY_PCM:
                    playPcmStream((byte[]) msg.obj);
                    break;
                default:
                    break;
            }
        }
    };

    private class MyTcpListener implements ITcpSocketListener {
        @Override
        public void onRec(Socket socket, byte[] buffer) {
            sendHandlerMsg(HANDLER_MSG_PLAY_PCM, 0, buffer);
        }
    }
```

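The activity above also depends on two pieces the article does not show: the mAudioTrack used by playPcmStream(), and the TcpServerThread / ITcpSocketListener pair that actually reads from each accepted socket. The sketch below is one possible shape for them; the class and method names are taken from the snippet above, but the bodies, the 4 KB read buffer, and the 44.1 kHz 16-bit stereo format are assumptions (the AudioTrack parameters must match whatever PCM the sender is playing).

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

import java.io.InputStream;
import java.net.Socket;
import java.util.Arrays;

// Callback used by TcpServerThread to hand received PCM chunks back to the activity.
interface ITcpSocketListener {
    void onRec(Socket socket, byte[] buffer);
}

// Reads from one accepted socket in a loop and forwards each chunk to the listener.
class TcpServerThread implements Runnable {
    private final Socket mSocket;
    private final ITcpSocketListener mListener;

    TcpServerThread(Socket socket, ITcpSocketListener listener) {
        mSocket = socket;
        mListener = listener;
    }

    @Override
    public void run() {
        try (InputStream in = mSocket.getInputStream()) {
            byte[] buffer = new byte[4096];   // assumed chunk size
            int len;
            while ((len = in.read(buffer)) > 0) {
                mListener.onRec(mSocket, Arrays.copyOf(buffer, len));
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

// Builds the streaming AudioTrack that playPcmStream() writes into.
class PcmPlayerFactory {
    static AudioTrack createPcmAudioTrack() {
        int sampleRate = 44100;                                   // assumed; must match the sender
        int channelConfig = AudioFormat.CHANNEL_OUT_STEREO;
        int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
        int minBufferSize = AudioTrack.getMinBufferSize(sampleRate, channelConfig, audioFormat);
        return new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, channelConfig,
                audioFormat, minBufferSize * 2, AudioTrack.MODE_STREAM);
    }
}
```
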
## 4. Summary

When I first got this requirement, it took quite a bit of thinking before I arrived at this approach. It confirmed, once again, that being familiar with the framework layer gives you many more ways to attack a problem. I ran into plenty of issues while debugging, but happily everything ended up working.

I have run and tested this approach on Android 5.0 and Android 7.0; I hope it is helpful to you.