融雲技術分享:基於WebRTC的實時音視頻首幀顯示時間優化實踐

{"type":"doc","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"本文由融雲技術團隊原創投稿,作者是融雲WebRTC高級工程師蘇道,轉載請註明出處。"}]},{"type":"heading","attrs":{"align":null,"level":1},"content":[{"type":"text","text":"1、引言"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"在一個典型的IM應用裏,使用實時音視頻聊天功能時,視頻首幀的顯示,是一項很重要的用戶體驗指標。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"本文主要通過對WebRTC接收端的音視頻處理過程分析,來了解和優化視頻首幀的顯示時間,並進行了總結和分享。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"(本文同步發佈於:"},{"type":"link","attrs":{"href":"http://www.52im.net/thread-3169-1-1.html","title":null},"content":[{"type":"text","text":"http://www.52im.net/thread-3169-1-1.html"}]},{"type":"text","text":")"}]},{"type":"heading","attrs":{"align":null,"level":1},"content":[{"type":"text","text":"2、什麼是WebRTC?"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"對於沒接觸過實時音視頻技術的人來說,總是看到別人在提WebRTC,那WebRTC是什麼?我們有必要簡單介紹一下。"}]},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/f8/f8b1af0e9025fae6b6d994dee32116bb.jpeg","alt":"","title":null,"style":null,"href":null,"fromPaste":true,"pastePass":true}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"說到 WebRTC,我們不得不提到 Gobal IP Solutions,簡稱 GIPS。這是一家 1990 年成立於瑞典斯德哥爾摩的 VoIP 軟件開發商,提供了可以說是世界上最好的語音引擎。相關介紹詳見《"},{"type":"link","attrs":{"href":"http://www.52im.net/thread-227-1-1.html","title":null},"content":[{"type":"text","text":"訪談WebRTC標準之父:WebRTC的過去、現在和未來"}]},{"type":"text","text":"》。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"Skype、騰訊 QQ、WebEx、Vidyo 等都使用了它的音頻處理引擎,包含了受專利保護的回聲消除算法,適應網絡抖動和丟包的低延遲算法,以及先進的音頻編解碼器。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"Google 在 Gtalk 中也使用了 GIPS 的授權。Google 在 2011 年以6820萬美元收購了 GIPS,並將其源代碼開源,加上在 2010 年收購的 On2 獲取到的 VPx 系列視頻編解碼器(詳見《"},{"type":"link","attrs":{"href":"http://www.52im.net/thread-274-1-1.html","title":null},"content":[{"type":"text","text":"即時通訊音視頻開發(十七):視頻編碼H.264、VP8的前世今生"}]},{"type":"text","text":"》),WebRTC 開源項目應運而生,即 GIPS 音視頻引擎 + 替換掉 H.264 的 VPx 視頻編解碼器。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"在此之後,Google 又將在 Gtalk 中用於 P2P 打洞的開源項目 libjingle 融合進了 WebRTC。目前 WebRTC 提供了包括 Web、iOS、Android、Mac、Windows、Linux 在內的所有平臺支持。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"(以上介紹,引用自《"},{"type":"link","attrs":{"href":"http://www.52im.net/thread-1631-1-1.html","title":null},"content":[{"type":"text","text":"了不起的WebRTC:生態日趨完善,或將實時音視頻技術白菜化"}]},{"type":"text","text":"》)"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"雖然WebRTC的目標是實現跨平臺的Web端實時音視頻通訊,但因爲核心層代碼的Native、高品質和內聚性,開發者很容易進行除Web平臺外的移殖和應用。目前爲止,WebRTC幾乎是是業界能免費得到的唯一高品質實時音視頻通訊技術。"}]},{"type":"heading","attrs":{"align":null,"level":1},"content":[{"type":"text","text":"3、流程介紹"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"一個典型的實時音視頻處理流程大概是這樣:"}]},{"type":"bulletedlist","content":[{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"1)發送端採集音視頻數據,通過編碼器生成幀數據;"}]}]},{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"2)這數據被打包成 RTP 包,通過 ICE 通道發送到接收端;"}]}]},{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"3)接收端接收 RTP 包,取出 RTP payload,完成組幀的操作;"}]}]},{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"4)之後音視頻解碼器解碼幀數據,生成視頻圖像或音頻 PCM 數據。"}]}]}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"如下圖所示:"}]},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/f6/f6984f40bd867555aa4b97d8221bf79e.jpeg","alt":"","title":null,"style":null,"href":null,"fromPaste":true,"pastePass":true}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"本文所涉及的參數調整,談論的部分位於上圖中的第 4 步。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"因爲是接收端,所以會收到對方的 Offer 請求。先設置 SetRemoteDescription 再 SetLocalDescription。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"如下圖藍色部分: "}]},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/2c/2c628afc436947a4fe7d84590ff5c40d.jpeg","alt":"","title":null,"style":null,"href":null,"fromPaste":true,"pastePass":true}},{"type":"heading","attrs":{"align":null,"level":1},"content":[{"type":"text","text":"4、參數調整"}]},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"4.1 視頻參數調整"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"當收到 Signal 線程 SetRemoteDescription 後,會在 Worker 線程中創建 VideoReceiveStream 對象。具體流程爲 SetRemoteDescription -> VideoChannel::SetRemoteContent_w 創建 WebRtcVideoReceiveStream。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"WebRtcVideoReceiveStream 包含了一個 VideoReceiveStream 類型 stream_ 對象, 通過 webrtc::VideoReceiveStream* Call::CreateVideoReceiveStream 創建。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"創建後立即啓動 VideoReceiveStream 工作,即調用 Start() 方法。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"此時 VideoReceiveStream 包含一個 RtpVideoStreamReceiver 對象準備開始處理 video RTP 包。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"接收方創建 createAnswer 後通過 setLocalDescription 設置 local descritpion。 "}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"對應會在 Worker 線程中 setLocalContent_w 方法中根據 SDP 設置 channel 的接收參數,最終會調用到 WebRtcVideoReceiveStream::SetRecvParameters。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"WebRtcVideoReceiveStream::SetRecvParameters 實現如下:"}]},{"type":"blockquote","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"void WebRtcVideoChannel::WebRtcVideoReceiveStream::SetRecvParameters("}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    const ChangedRecvParameters& params) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  bool video_needs_recreation = false;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  bool flexfec_needs_recreation = false;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  if(params.codec_settings) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    ConfigureCodecs(*params.codec_settings);"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    video_needs_recreation = true;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  if(params.rtp_header_extensions) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    config_.rtp.extensions = *params.rtp_header_extensions;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    flexfec_config_.rtp_header_extensions = *params.rtp_header_extensions;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    video_needs_recreation = true;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    flexfec_needs_recreation = true;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  if(params.flexfec_payload_type) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    ConfigureFlexfecCodec(*params.flexfec_payload_type);"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    flexfec_needs_recreation = true;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  if(flexfec_needs_recreation) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    RTC_LOG(LS_INFO) << \"MaybeRecreateWebRtcFlexfecStream (recv) because of \""}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                        \"SetRecvParameters\";"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    MaybeRecreateWebRtcFlexfecStream();"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  if(video_needs_recreation) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    RTC_LOG(LS_INFO)"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"        << \"RecreateWebRtcVideoStream (recv) because of SetRecvParameters\";"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    RecreateWebRtcVideoStream();"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"}"}]}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"根據上面 SetRecvParameters 代碼,如果 codec_settings 不爲空、rtp_header_extensions 不爲空、flexfec_payload_type 不爲空都會重啓 VideoReceiveStream。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"video_needs_recreation 表示是否要重啓 VideoReceiveStream。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"重啓過程爲:"},{"type":"text","text":"把先前創建的釋放掉,然後重建新的 VideoReceiveStream。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"以 codec_settings 爲例:初始 video codec 支持 H264 和 VP8。若對端只支持 H264,協商後的 codec 僅支持 H264。SetRecvParameters 中的 codec_settings 爲 H264 不空。其實前後 VideoReceiveStream 的都有 H264 codec,沒有必要重建 VideoReceiveStream。可以通過配置本地支持的 video codec 初始列表和 rtp extensions,從而生成的 local SDP 和 remote SDP 中影響接收參數部分調整一致,並且判斷 codec_settings 是否相等。 如果不相等再 video_needs_recreation 爲 true。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"這樣設置就會使 SetRecvParameters 避免觸發重啓 VideoReceiveStream 邏輯。 "}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"在 debug 模式下,修改後,驗證沒有 “RecreateWebRtcVideoStream (recv) because of SetRecvParameters” 的打印, 即可證明沒有 VideoReceiveStream 重啓。"}]},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"4.2 音頻參數調整"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"和上面的視頻調整類似,音頻也會有因爲 rtp extensions 不一致導致重新創建 AudioReceiveStream,也是釋放先前的 AudioReceiveStream,再重新創建 AudioReceiveStream。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"參考代碼:"}]},{"type":"blockquote","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"bool WebRtcVoiceMediaChannel::SetRecvParameters("}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    const AudioRecvParameters& params) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  TRACE_EVENT0(\"webrtc\", \"WebRtcVoiceMediaChannel::SetRecvParameters\");"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  RTC_LOG(LS_INFO) << \"WebRtcVoiceMediaChannel::SetRecvParameters: \""}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                   << params.ToString();"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  // TODO(pthatcher): Refactor this to be more clean now that we have"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  // all the information at once."}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":" "}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  if(!SetRecvCodecs(params.codecs)) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    return false;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":" "}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  if(!ValidateRtpExtensions(params.extensions)) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    return false;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  std::vector<:rtpextension> filtered_extensions = FilterRtpExtensions("}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"      params.extensions, webrtc::RtpExtension::IsSupportedForAudio, false);"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  if(recv_rtp_extensions_ != filtered_extensions) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    recv_rtp_extensions_.swap(filtered_extensions);"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    for(auto& it : recv_streams_) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"      it.second->SetRtpExtensionsAndRecreateStream(recv_rtp_extensions_);"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  return true;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"}"}]}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"AudioReceiveStream 的構造方法會啓動音頻設備,即調用 AudioDeviceModule 的 StartPlayout。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"AudioReceiveStream 的析構方法會停止音頻設備,即調用 AudioDeviceModule 的 StopPlayout。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"因此重啓 AudioReceiveStream 會觸發多次 StartPlayout/StopPlayout。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"經測試,這些不必要的操作會導致進入視頻會議的房間時,播放的音頻有一小段間斷的情況。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"解決方法:"},{"type":"text","text":"同樣是通過配置本地支持的 audio codec 初始列表和 rtp extensions,從而生成的 local SDP 和 remote SDP 中影響接收參數部分調整一致,避免 AudioReceiveStream 重啓邏輯。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"另外 audio codec 多爲 WebRTC 內部實現,去掉一些不用的 Audio Codec,可以減小 WebRTC 對應的庫文件。"}]},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"4.3 音視頻相互影響"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"WebRTC 內部有三個非常重要的線程:"}]},{"type":"bulletedlist","content":[{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"1)woker 線程;"}]}]},{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"2)signal 線程;"}]}]},{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"3)network 線程。"}]}]}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"調用 PeerConnection 的 API 的調用會由 signal 線程進入到 worker 線程。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"worker 線程內完成媒體數據的處理,network 線程處理網絡相關的事務,channel.h 文件中有說明,以 _w 結尾的方法爲 worker 線程的方法,signal 線程的到 worker 線程的調用是同步操作。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"如下面代碼中的 InvokerOnWorker 是同步操作,setLocalContent_w 和 setRemoteContent_w 是 worker 線程中的方法。"}]},{"type":"blockquote","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"bool BaseChannel::SetLocalContent(const MediaContentDescription* content,"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                                  SdpType type,"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                                  std::string* error_desc) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  TRACE_EVENT0(\"webrtc\", \"BaseChannel::SetLocalContent\");"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  returnI nvokeOnWorker("}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"      RTC_FROM_HERE,"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"      Bind(&BaseChannel::SetLocalContent_w, this, content, type, error_desc));"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"}"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":" "}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"bool BaseChannel::SetRemoteContent(const MediaContentDescription* content,"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                                   SdpType type,"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                                   std::string* error_desc) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  TRACE_EVENT0(\"webrtc\", \"BaseChannel::SetRemoteContent\");"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  return InvokeOnWorker("}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"      RTC_FROM_HERE,"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"      Bind(&BaseChannel::SetRemoteContent_w, this, content, type, error_desc));"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"}"}]}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"setLocalDescription 和 setRemoteDescription 中的 SDP 信息都會通過 PeerConnection 的 PushdownMediaDescription 方法依次下發給 audio/video RtpTransceiver 設置 SDP 信息。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"舉例:"},{"type":"text","text":"執行 audio 的 SetRemoteContent_w 執行很長(比如音頻 AudioDeviceModule 的 InitPlayout 執行耗時), 會影響後面的 video SetRemoteContent_w 的設置時間。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"PushdownMediaDescription 代碼:"}]},{"type":"blockquote","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"RTCError PeerConnection::PushdownMediaDescription("}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    SdpType type,"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    cricket::ContentSource source) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  const SessionDescriptionInterface* sdesc ="}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"      (source == cricket::CS_LOCAL ? local_description()"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                                   : remote_description());"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  RTC_DCHECK(sdesc);"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":" "}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  // Push down the new SDP media section for each audio/video transceiver."}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  for(const auto& transceiver : transceivers_) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    const ContentInfo* content_info ="}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"        FindMediaSectionForTransceiver(transceiver, sdesc);"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    cricket::ChannelInterface* channel = transceiver->internal()->channel();"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    if(!channel || !content_info || content_info->rejected) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"      continue;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    const MediaContentDescription* content_desc ="}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"        content_info->media_description();"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    if(!content_desc) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"      continue;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    std::string error;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    bool success = (source == cricket::CS_LOCAL)"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                       ? channel->SetLocalContent(content_desc, type, &error)"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                       : channel->SetRemoteContent(content_desc, type, &error);"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    if(!success) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"      LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, error);"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"    }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"  ..."}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"}"}]}]},{"type":"heading","attrs":{"align":null,"level":1},"content":[{"type":"text","text":"5、其他影響首幀顯示的問題"}]},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"5.1 Android圖像寬高16字節對齊"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"AndroidVideoDecoder 是 WebRTC Android 平臺上的視頻硬解類。AndroidVideoDecoder 利用 "},{"type":"link","attrs":{"href":"http://docs.52im.net/extend/docs/api/android-50/reference/android/media/MediaCodec.html","title":null},"content":[{"type":"text","text":"MediaCodec"}]},{"type":"text","text":" API 完成對硬件解碼器的調用。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"link","attrs":{"href":"http://docs.52im.net/extend/docs/api/android-50/reference/android/media/MediaCodec.html","title":null},"content":[{"type":"text","text":"MediaCodec"}],"marks":[{"type":"strong"}]},{"type":"text","marks":[{"type":"strong"}],"text":" 有已下解碼相關的 API:"}]},{"type":"bulletedlist","content":[{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"1)dequeueInputBuffer:若大於 0,則是返回填充編碼數據的緩衝區的索引,該操作爲同步操作;"}]}]},{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"2)getInputBuffer:填充編碼數據的 ByteBuffer 數組,結合 dequeueInputBuffer 返回值,可獲取一個可填充編碼數據的 ByteBuffer;"}]}]},{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"3)queueInputBuffer:應用將編碼數據拷貝到 ByteBuffer 後,通過該方法告知 MediaCodec 已經填寫的編碼數據的緩衝區索引;"}]}]},{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"4)dequeueOutputBuffer:若大於 0,則是返回填充解碼數據的緩衝區的索引,該操作爲同步操作;"}]}]},{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"5)getOutputBuffer:填充解碼數據的 ByteBuffer 數組,結合 dequeueOutputBuffer 返回值,可獲取一個可填充解碼數據的 ByteBuffer;"}]}]},{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"6)releaseOutputBuffer:告訴編碼器數據處理完成,釋放 ByteBuffer 數據。"}]}]}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"在實踐當中發現,發送端發送的視頻寬高需要 16 字節對齊,因爲在某些 Android 手機上解碼器需要 16 字節對齊。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"大致的原理就是:"},{"type":"text","text":"Android 上視頻解碼先是把待解碼的數據通過 queueInputBuffer 給到 MediaCodec。然後通過 dequeueOutputBuffer 反覆查看是否有解完的視頻幀。若非 16 字節對齊,dequeueOutputBuffer 會有一次MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED。而不是一上來就能成功解碼一幀。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"經測試發現:"},{"type":"text","text":"幀寬高非 16 字節對齊會比 16 字節對齊的慢 100 ms 左右。"}]},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"5.2 服務器需轉發關鍵幀請求"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"iOS 移動設備上,WebRTC App應用進入後臺後,視頻解碼由 VTDecompressionSessionDecodeFrame 返回 kVTInvalidSessionErr,表示解碼session 無效。從而會觸發觀看端的關鍵幀請求給服務器。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"這裏要求服務器必須轉發接收端發來的關鍵幀請求給發送端。若服務器沒有轉發關鍵幀給發送端,接收端就會長時間沒有可以渲染的圖像,從而出現黑屏問題。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"這種情況下只能等待發送端自己生成關鍵幀,發送個接收端,從而使黑屏的接收端恢復正常。"}]},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"5.3 WebRTC內部的一些丟棄數據邏輯舉例"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"Webrtc從接受報數據到、給到解碼器之間的過程中也會有很多驗證數據的正確性。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"italic"},{"type":"strong"}],"text":"舉例1:"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"PacketBuffer 中記錄着當前緩存的最小的序號 first_seq_num_(這個值也是會被更新的)。 當 PacketBuffer 中 InsertPacket 時候,如果即將要插入的 packet 的序號 seq_num 小於 first_seq_num,這個 packet 會被丟棄掉。如果因此持續丟棄 packet,就會有視頻不顯示或卡頓的情況。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"italic"},{"type":"strong"}],"text":"舉例2:"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"正常情況下 FrameBuffer 中幀的 picture id,時間戳都是一直正增長的。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"如果 FrameBuffer 收到 picture_id 比最後解碼幀的 picture id 小時,分兩種情況:"}]},{"type":"bulletedlist","content":[{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"1)時間戳比最後解碼幀的時間戳大,且是關鍵幀,就會保存下來。"}]}]},{"type":"listitem","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"2)除情況 1 之外的幀都會丟棄掉。"}]}]}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"代碼如下: "}]},{"type":"blockquote","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"auto last_decoded_frame = decoded_frames_history_.GetLastDecodedFrameId();"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":" auto last_decoded_frame_timestamp ="}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"     decoded_frames_history_.GetLastDecodedFrameTimestamp();"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":" if(last_decoded_frame && id <= *last_decoded_frame) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"   if(AheadOf(frame->Timestamp(), *last_decoded_frame_timestamp) &&"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"       frame->is_keyframe()) {"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"     // If this frame has a newer timestamp but an earlier picture id then we"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"     // assume there has been a jump in the picture id due to some encoder"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"     // reconfiguration or some other reason. Even though this is not according"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"     // to spec we can still continue to decode from this frame if it is a"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"     // keyframe."}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"     RTC_LOG(LS_WARNING)"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"         << \"A jump in picture id was detected, clearing buffer.\";"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"     ClearFramesAndHistory();"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"     last_continuous_picture_id = -1;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"   } else{"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"     RTC_LOG(LS_WARNING) << \"Frame with (picture_id:spatial_id) (\""}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                         << id.picture_id << \":\""}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                         << static_cast(id.spatial_layer)"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                         << \") inserted after frame (\""}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                         << last_decoded_frame->picture_id << \":\""}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                         << static_cast(last_decoded_frame->spatial_layer)"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"                         << \") was handed off for decoding, dropping frame.\";"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"     return last_continuous_picture_id;"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"   }"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":" }"}]}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"因此爲了能讓收到了流順利播放,發送端和中轉的服務端需要確保視頻幀的 picture_id, 時間戳正確性。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"WebRTC 還有其他很多丟幀邏輯,若網絡正常且有持續有接收數據,但是視頻卡頓或黑屏無顯示,多爲流本身的問題。"}]},{"type":"heading","attrs":{"align":null,"level":1},"content":[{"type":"text","text":"6、本文小結"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"本文通過分析 WebRTC 音視頻接收端的處理邏輯,列舉了一些可以優化首幀顯示的點,比如通過調整 local SDP 和 remote SDP 中與影響接收端處理的相關部分,從而避免 Audio/Video ReceiveStream 的重啓。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"另外列舉了 Android 解碼器對視頻寬高的要求、服務端對關鍵幀請求處理、以及 WebRTC 代碼內部的一些丟幀邏輯等多個方面對視頻顯示的影響。 這些點都提高了融雲 SDK 視頻首幀的顯示時間,改善了用戶體驗。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"因個人水平有限,文章內容或許存在一定的侷限性,歡迎回復進行討論。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"(本文同步發佈於:"},{"type":"link","attrs":{"href":"http://www.52im.net/thread-3169-1-1.html","title":null},"content":[{"type":"text","text":"http://www.52im.net/thread-3169-1-1.html"}]},{"type":"text","text":")"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}}]}
發表評論
所有評論
還沒有人評論,想成為第一個評論的人麼? 請在上方評論欄輸入並且點擊發布.
相關文章