An Effective Solution for Capturing the Real-Time Audio Stream While Playing Video on an Android Device

{"type":"doc","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"这篇文章将会按照一般的需求开发流程,从需求、分析、开发,到总结,来给大家讲解","attrs":{}},{"type":"text","marks":[{"type":"strong","attrs":{}}],"text":"一种“在Android 设备上,播放视频的同时,获取实时音频流”的有效方案","attrs":{}},{"type":"text","text":"。","attrs":{}}]},{"type":"heading","attrs":{"align":null,"level":2},"content":[{"type":"text","text":"一、需求","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"在车载产品上,有这样一种需求,比如我把我的Android设备通过usb线连接上车机,这时我希望我在我Android手机上的操作,能同步到车机大屏上进行显示。","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"现在很多车机基本都是Android系统了,市场上也有类似","attrs":{}},{"type":"text","marks":[{"type":"strong","attrs":{}}],"text":"CarPlay","attrs":{}},{"type":"text","text":"、","attrs":{}},{"type":"text","marks":[{"type":"strong","attrs":{}}],"text":"CarLife","attrs":{}},{"type":"text","text":"这种专门做手机投屏的软件了。不过呢,还有一部分的车子,他们的车机用的是Linux系统,这时如何实现Android设备和linux设备之间的屏幕信息同步呢?","attrs":{}}]},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/74/7450626a66624d084a782d794eb23dd9.png","alt":null,"title":"百度Carlife、苹果Carplay","style":[{"key":"width","value":"75%"},{"key":"bordertype","value":"boxShadow"}],"href":"","fromPaste":false,"pastePass":false}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"接下来的文章,我们只介绍其中的一种场景,就是我手机播放视频的时候,视频内容和视频的声音,都同步到linux系统的车机上。而且这篇文章,我们只介绍音频同步的部分。","attrs":{}}]},{"type":"heading","attrs":{"align":null,"level":2},"content":[{"type":"text","text":"二、分析","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"两个设备之间的音频同步,那就是把一个设备中的音频数据同步到另一个设备上,一方作为发送端,另一方作为接收端,发送端不停的发生音频流,接收端接收到音频流后,进行实时的播放,即可实现我们想要的效果。","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"说到设备之间的通信,相信很多同学会想到","attrs":{}},{"type":"text","marks":[{"type":"strong","attrs":{}}],"text":"tcp","attrs":{}},{"type":"text","text":"、udp这些协议了。是的,考虑到tcp协议传输的有序性,而udp是无序的,我们传输的音频数据也是需要有序的,所有音频数据的传输,我们采用tcp协议。","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"接下来我们再了解下,在Android系统上,声音的播放流程是怎样的呢?这对我们如何去获取视频播放时候的音频流,很有帮助。","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"我们先看下关于视频的播放、录音,Android都给我们提供了哪些API?","attrs":{}}]},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"MediaRecorder","attrs":{}}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"接触过Android录像、录音的同学,应该对MediaRecorder 这个API不会感到模式。是的,在Android系统上,我们可以通过MediaRecorder API来很容易的实现录像、录音功能,下面是关于MediaRecorder 状态图,具体接口的使用,感兴趣的可以查看Android 
Next, let's look at how sound playback actually works on Android. Understanding this is very helpful for figuring out where to grab the audio stream while a video is playing.

First of all, which APIs does Android provide for video playback and recording?

### MediaRecorder

Anyone who has done video or audio recording on Android should be familiar with the MediaRecorder API. With MediaRecorder it is easy to implement video and audio recording. The diagram below shows the MediaRecorder state machine; for details on the individual calls, see the official Android documentation (https://developer.android.google.cn/guide/topics/media/mediarecorder?hl=zh_cn).

![Android MediaRecorder state diagram](https://static001.geekbang.org/infoq/68/681e5c43050a99549768e4dd26644f62.png)

### MediaPlayer

For playing video, Android provides the MediaPlayer API (https://developer.android.google.cn/guide/topics/media/mediaplayer?hl=en).
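As a quick reminder of how video playback is typically started with MediaPlayer (standard API usage, not something specific to this article), a minimal sketch might look like the following; the video path and the SurfaceHolder are placeholders:

```java
import android.media.MediaPlayer;
import android.view.SurfaceHolder;

// Minimal sketch of typical MediaPlayer usage; error handling and lifecycle
// management (release(), surface callbacks) are omitted for brevity.
public class VideoPlaybackHelper {

    public MediaPlayer startPlayback(String videoPath, SurfaceHolder holder) throws Exception {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(videoPath);   // local file path or URL (placeholder)
        player.setDisplay(holder);         // render the video into a SurfaceView's holder
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                // The decoded audio of this playback is what reaches AudioTrack in the framework layer.
                mp.start();
            }
        });
        player.prepareAsync();
        return player;
    }
}
```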
With these two APIs in mind, let's look at the architecture of the Android audio system.

![Android audio system architecture](https://static001.geekbang.org/infoq/81/81de5a381ca5e02a1c22f42b39cd6708.png)

From the audio framework diagram above (follow the parts marked in red), we can see that when an application plays sound through an API such as MediaPlayer, the audio data in the framework layer eventually flows through AudioTrack.cpp.

Coming back to the point of this article: we need to grab the audio stream in real time while a video is playing, and that interception work can therefore be done inside **AudioTrack.cpp**.

Let's look at the key method in AudioTrack.cpp:

```cpp
// During video playback, the decoded audio data is written out through AudioTrack::write().
ssize_t AudioTrack::write(const void* buffer, size_t userSize, bool blocking)
{
    if (mTransfer != TRANSFER_SYNC) {
        return INVALID_OPERATION;
    }

    if (isDirect()) {
        AutoMutex lock(mLock);
        int32_t flags = android_atomic_and(
                ~(CBLK_UNDERRUN | CBLK_LOOP_CYCLE | CBLK_LOOP_FINAL | CBLK_BUFFER_END),
                &mCblk->mFlags);
        if (flags & CBLK_INVALID) {
            return DEAD_OBJECT;
        }
    }

    if (ssize_t(userSize) < 0 || (buffer == NULL && userSize != 0)) {
        // Sanity-check: user is most-likely passing an error code, and it would
        // make the return value ambiguous (actualSize vs error).
        ALOGE("AudioTrack::write(buffer=%p, size=%zu (%zd)", buffer, userSize, userSize);
        return BAD_VALUE;
    }

    size_t written = 0;
    Buffer audioBuffer;

    while (userSize >= mFrameSize) {
        audioBuffer.frameCount = userSize / mFrameSize;

        status_t err = obtainBuffer(&audioBuffer,
                blocking ? &ClientProxy::kForever : &ClientProxy::kNonBlocking);
        if (err < 0) {
            if (written > 0) {
                break;
            }
            if (err == TIMED_OUT || err == -EINTR) {
                err = WOULD_BLOCK;
            }
            return ssize_t(err);
        }

        size_t toWrite = audioBuffer.size;
        memcpy(audioBuffer.i8, buffer, toWrite);

        // Added for this solution: keep a copy of this PCM chunk so it can be sent over the socket (section 3).
        mBuffer = malloc(toWrite);
        memcpy(mBuffer, buffer, toWrite);

        buffer = ((const char *) buffer) + toWrite;
        userSize -= toWrite;
        written += toWrite;

        releaseBuffer(&audioBuffer);
    }

    if (written > 0) {
        mFramesWritten += written / mFrameSize;
    }
    return written;
}
```

## 3. Implementation

After the analysis above, the plan is fairly clear: in AudioTrack.cpp in the framework layer, push the audio stream out in real time over a socket. On the other side, the receiver keeps reading that socket data, which is nothing but a real-time PCM stream, and plays it back as it arrives. That gives us real-time audio synchronization.

As for how the video stream is synchronized, feel free to take a guess.

### 1) Changes in AudioTrack.cpp

```cpp
#define DEST_PORT 5046
#define DEST_IP_ADDRESS "192.168.7.6"

int mSocket;
bool mSocketHasInit;
bool mCurrentPlayMusicStream;
struct sockaddr_in mRemoteAddr;

ssize_t AudioTrack::write(const void* buffer, size_t userSize, bool blocking)
{
    ......
    size_t toWrite = audioBuffer.size;
    memcpy(audioBuffer.i8, buffer, toWrite);

    mBuffer = malloc(toWrite);
    memcpy(mBuffer, buffer, toWrite);
    // Our added code: send the audio stream out in real time.
    if (mCurrentPlayMusicStream && mSocketHasInit) {
        onSocketSendData(toWrite);
    }
    ......
}

int AudioTrack::onSocketSendData(uint32_t len) {
    assert(NULL != mBuffer);
    assert(-1 != len);

    if (!mSocketHasInit) {
        initTcpSocket();
    }

    unsigned int ret = send(mSocket, mBuffer, len, 0);
    free(mBuffer);
    return 0;
}
```

### 2) Receiver-side code

(I used an Android device for debugging here; if the receiver runs Linux, the approach is the same.)

The receiver's processing logic is as follows:

1. Set up the socket listener;
2. Loop and listen for data on the socket port;
3. Receive the PCM stream;
4. Play the PCM stream.

![Receiver-side processing flow](https://static001.geekbang.org/infoq/7f/7fcbd3f40c31cacf67766028f1b2af27.png)
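The PlayActivity listing below uses an `mAudioTrack` instance and a `TcpServerThread`/`ITcpSocketListener` pair that are not shown here. The following is a minimal sketch of what they might look like; the PCM parameters (44100 Hz, stereo, 16-bit) are assumptions and must match the format of the stream that the sender's AudioTrack.cpp is actually writing.

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

import java.io.InputStream;
import java.net.Socket;
import java.util.Arrays;

// Sketch of the mAudioTrack configuration used by playPcmStream().
class PcmAudioTrackFactory {
    // The sample rate / channel layout / encoding are assumptions; they must
    // match the PCM produced by the sending device.
    static AudioTrack create() {
        int sampleRate = 44100;
        int channelConfig = AudioFormat.CHANNEL_OUT_STEREO;
        int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
        int minBufferSize = AudioTrack.getMinBufferSize(sampleRate, channelConfig, audioFormat);

        return new AudioTrack(AudioManager.STREAM_MUSIC,
                sampleRate, channelConfig, audioFormat,
                minBufferSize * 2,          // some extra headroom against underruns
                AudioTrack.MODE_STREAM);    // stream mode: write() is called repeatedly
    }
}

// Sketch of the listener interface implemented by MyTcpListener in PlayActivity.
interface ITcpSocketListener {
    void onRec(Socket socket, byte[] buffer);
}

// Sketch of the per-connection reader thread started by startTcpService().
class TcpServerThread implements Runnable {
    private final Socket mSocket;
    private final ITcpSocketListener mListener;

    TcpServerThread(Socket socket, ITcpSocketListener listener) {
        mSocket = socket;
        mListener = listener;
    }

    @Override
    public void run() {
        try {
            InputStream in = mSocket.getInputStream();
            byte[] buffer = new byte[4096];
            int len;
            // Keep reading PCM bytes for as long as the sender keeps the connection open.
            while ((len = in.read(buffer)) > 0) {
                // Hand over a correctly sized copy, since a read may fill only part of the buffer.
                mListener.onRec(mSocket, Arrays.copyOf(buffer, len));
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```

With those pieces in place, the receiver code looks like this: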
```java
----------- PlayActivity.java ---------------------------------------

    private ServerSocket mTcpServerSocket = null;
    private List<Socket> mSocketList = new ArrayList<>();
    private MyTcpListener mTcpListener = null;

    private boolean isAccept = true;

    /**
     * Set up the socket listener.
     */
    public void startTcpService() {
        Log.v(TAG, "startTcpService();");
        if (mTcpListener == null) {
            mTcpListener = new MyTcpListener();
        }

        new Thread() {
            @Override
            public void run() {
                super.run();
                try {
                    mTcpServerSocket = new ServerSocket();
                    mTcpServerSocket.setReuseAddress(true);
                    InetSocketAddress socketAddress = new InetSocketAddress(AndroidBoxProtocol.TCP_AUDIO_STREAM_PORT);
                    mTcpServerSocket.bind(socketAddress);

                    while (isAccept) {
                        Socket socket = mTcpServerSocket.accept();
                        mSocketList.add(socket);

                        // Start a new thread to receive data from this socket.
                        new Thread(new TcpServerThread(socket, mTcpListener)).start();
                    }
                } catch (Exception e) {
                    Log.e("TcpServer", "" + e.toString());
                }
            }
        }.start();
    }

    /**
     * Stop the socket listener.
     */
    private void stopTcpService() {
        isAccept = false;
        if (mTcpServerSocket != null) {
            new Thread() {
                @Override
                public void run() {
                    super.run();
                    try {
                        for (Socket socket : mSocketList) {
                            socket.close();
                        }
                        mTcpServerSocket.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }.start();
        }
    }

    /**
     * Play the real-time PCM stream.
     * @param buffer
     */
    private void playPcmStream(byte[] buffer) {
        if (mAudioTrack != null && buffer != null) {
            mAudioTrack.play();
            mAudioTrack.write(buffer, 0, buffer.length);
        }
    }

    private Handler mUiHandler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            super.handleMessage(msg);
            switch (msg.what) {
                case HANDLER_MSG_PLAY_PCM:
                    playPcmStream((byte[]) msg.obj);
                    break;
                default:
                    break;
            }
        }
    };

    private class MyTcpListener implements ITcpSocketListener {
        @Override
        public void onRec(Socket socket, byte[] buffer) {
            sendHandlerMsg(HANDLER_MSG_PLAY_PCM, 0, buffer);
        }
    }
```

## 4. Summary

When I first received this requirement, it took quite a bit of thinking before I arrived at this solution. It confirms once again that being familiar with the framework layer can open up many approaches to solving a problem. I also ran into quite a few issues while debugging, but happily the result turned out well and everything worked in the end.

I have run and tested this solution on both Android 5.0 and Android 7.0. I hope it is helpful to you.