Android apps are generally written in Java, and a typical playback flow looks roughly like this:
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setOnCompletionListener(new OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer mp) {
        mp.release();  // release the player once playback completes
    }
});
mediaPlayer.setDataSource("abc.mp3");
mediaPlayer.setDisplay(surfaceHolder);
mediaPlayer.prepare();
mediaPlayer.start();
We will follow this flow and analyze the whole playback process step by step.
(1) The startup process of Media Server:
As we know, Android is based on the Linux kernel, and on Linux the first process to start is the init process; every other process is a descendant of init. During startup, init parses the Linux configuration script init.rc and, based on its contents, mounts Android's file systems, creates system directories, and launches Android's important daemons, including the USB daemon, the adb daemon, the vold daemon, and so on.
At the same time, init also starts important services such as Media Server (the multimedia service) and ServiceManager (the housekeeper of Binder services).
In addition, init spawns the Zygote process, the first Java process in the Android system and the parent of all other Java processes, as shown in the figure:
In versions before Android M (6.0), the startup script entry for the mediaserver service lives in the system/core/rootdir/init.rc file:
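On devices of that era the entry looks roughly like the following (reconstructed from pre-M AOSP sources; the exact group list varies by version):

```
service media /system/bin/mediaserver
    class main
    user media
    group audio camera inet net_bt net_bt_admin net_bw_acct drm mediadrm
    ioprio rt 4
```

init parses this block and forks /system/bin/mediaserver as the `media` user when the `main` service class is started.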
From Android N (7.0) on, the startup script entry for the mediaserver service was moved into the system/core/rootdir/init.zygote64.rc file:
After mediaserver starts, it registers a number of media-related services with ServiceManager, among them MediaPlayerService, ResourceManagerService, and others. mediaserver can be thought of as the server for everything media-related; it is what provides these services to apps.
Now let's look at where MediaPlayerService sits in the Android framework:
The core file of mediaserver is frameworks/av/media/mediaserver/main_mediaserver.cpp:
int main(int argc __unused, char **argv __unused)
{
    signal(SIGPIPE, SIG_IGN);

    sp<ProcessState> proc(ProcessState::self());
    sp<IServiceManager> sm(defaultServiceManager());
    ALOGI("ServiceManager: %p", sm.get());
    InitializeIcuOrDie();
    MediaPlayerService::instantiate();
    ResourceManagerService::instantiate();
    registerExtensions();
    ProcessState::self()->startThreadPool();
    IPCThreadState::self()->joinThreadPool();
}
The key line is MediaPlayerService::instantiate(), which initializes MediaPlayerService; it is implemented in frameworks/av/media/libmediaplayerservice/MediaPlayerService.cpp:
void MediaPlayerService::instantiate() {
    defaultServiceManager()->addService(
            String16("media.player"), new MediaPlayerService());
}
This registers a named Binder service, media.player, with ServiceManager.
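The addService()/getService() pattern can be pictured with a toy in-process registry (purely illustrative; the real ServiceManager is a separate process whose table is reached over Binder IPC, not a local map):

```java
import java.util.HashMap;
import java.util.Map;

// Toy analog of ServiceManager's name -> service table.
public class ToyServiceManager {
    private final Map<String, Object> services = new HashMap<>();

    // What MediaPlayerService::instantiate() does conceptually:
    // publish a service under a well-known name.
    public void addService(String name, Object service) {
        services.put(name, service);
    }

    // What a client does later: look the service up by the same name.
    public Object getService(String name) {
        return services.get(name);
    }

    public static void main(String[] args) {
        ToyServiceManager sm = new ToyServiceManager();
        sm.addService("media.player", "MediaPlayerService");
        System.out.println(sm.getService("media.player"));  // prints MediaPlayerService
    }
}
```

The name "media.player" is the only thing server and client need to agree on; the registry decouples them completely.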
(2) The creation of MediaPlayer
In the app we executed: MediaPlayer mediaPlayer = new MediaPlayer();
Let's look at what that does, in frameworks/base/media/java/android/media/MediaPlayer.java:
public MediaPlayer() {
    super(new AudioAttributes.Builder().build());

    Looper looper;
    if ((looper = Looper.myLooper()) != null) {
        mEventHandler = new EventHandler(this, looper);
    } else if ((looper = Looper.getMainLooper()) != null) {
        mEventHandler = new EventHandler(this, looper);
    } else {
        mEventHandler = null;
    }

    mTimeProvider = new TimeProvider(this);
    mOpenSubtitleSources = new Vector<InputStream>();

    /* Native setup requires a weak reference to our object.
     * It's easier to create it here than in C++.
     */
    native_setup(new WeakReference<MediaPlayer>(this));
}
In the constructor, the MediaPlayer created at the Java layer is passed down to the JNI layer through a weak reference. Before the constructor ever runs, a static initializer block in the MediaPlayer class loads the media_jni.so library and performs the JNI-related initialization:
static {
    System.loadLibrary("media_jni");
    native_init();
}

private static native final void native_init();
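Why native_setup() takes a weak reference rather than a strong one can be sketched in plain Java: the native layer must be able to reach the Java object while it is alive, but must not by itself keep the object from being garbage-collected.

```java
import java.lang.ref.WeakReference;

// Sketch of the weak-reference handoff: "player" stands in for the Java
// MediaPlayer, and "ref" for what the native layer holds.
public class WeakRefDemo {
    public static void main(String[] args) {
        Object player = new Object();
        WeakReference<Object> ref = new WeakReference<>(player);

        // While a strong reference exists, the native side can reach the object.
        System.out.println(ref.get() == player);  // prints true

        player = null;   // app drops its last strong reference
        System.gc();     // after collection, ref.get() may return null:
                         // the native side then simply stops posting events
    }
}
```

This is why postEventFromNative() at the Java layer always checks whether the weak reference still resolves before dispatching an event.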
This calls the native method native_init(), whose implementation lives in frameworks/base/media/jni/android_media_MediaPlayer.cpp. You cannot find native_init() there simply by searching for its name, however: the mapping is done through an array of JNINativeMethod structs:
static const JNINativeMethod gMethods[] = {
    {
        "nativeSetDataSource",
        "(Landroid/os/IBinder;Ljava/lang/String;[Ljava/lang/String;"
        "[Ljava/lang/String;)V",
        (void *)android_media_MediaPlayer_setDataSourceAndHeaders
    },
    {"_setDataSource",   "(Ljava/io/FileDescriptor;JJ)V",      (void *)android_media_MediaPlayer_setDataSourceFD},
    {"_setDataSource",   "(Landroid/media/MediaDataSource;)V", (void *)android_media_MediaPlayer_setDataSourceCallback},
    {"_setVideoSurface", "(Landroid/view/Surface;)V",          (void *)android_media_MediaPlayer_setVideoSurface},
    ...
    {"native_init",      "()V",                                (void *)android_media_MediaPlayer_native_init},
    {"native_setup",     "(Ljava/lang/Object;)V",              (void *)android_media_MediaPlayer_native_setup},
    ...
};
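The lookup this table enables can be pictured with a plain-Java analog (all names here are illustrative): JNI resolves a native method by its (name, signature) pair, much like keying a map, and dispatches to the registered function pointer.

```java
import java.util.HashMap;
import java.util.Map;

// Toy analog of a JNINativeMethod table: (name, signature) -> implementation.
public class NativeMethodTable {
    private final Map<String, Runnable> table = new HashMap<>();

    public void register(String name, String signature, Runnable impl) {
        table.put(name + signature, impl);  // key on the full (name, signature) pair
    }

    public Runnable lookup(String name, String signature) {
        return table.get(name + signature);
    }

    public static void main(String[] args) {
        NativeMethodTable t = new NativeMethodTable();
        // Mirrors the {"native_init", "()V", ...} row of gMethods:
        t.register("native_init", "()V",
                () -> System.out.println("android_media_MediaPlayer_native_init"));
        t.lookup("native_init", "()V").run();
    }
}
```

The signature matters because Java allows overloads: the two "_setDataSource" rows in gMethods differ only by signature.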
This array maps nearly every method of the MediaPlayer class, so whenever you meet a similar native method later you can locate its C++ implementation through it. Now let's continue with native_init():
static void
android_media_MediaPlayer_native_init(JNIEnv *env)
{
    jclass clazz;

    clazz = env->FindClass("android/media/MediaPlayer");
    if (clazz == NULL) {
        return;
    }

    fields.context = env->GetFieldID(clazz, "mNativeContext", "J");
    if (fields.context == NULL) {
        return;
    }

    fields.post_event = env->GetStaticMethodID(clazz, "postEventFromNative",
            "(Ljava/lang/Object;IIILjava/lang/Object;)V");
    if (fields.post_event == NULL) {
        return;
    }

    fields.surface_texture = env->GetFieldID(clazz, "mNativeSurfaceTexture", "J");
    if (fields.surface_texture == NULL) {
        return;
    }

    env->DeleteLocalRef(clazz);
    ...
}
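What the function above accomplishes — resolving handles to Java fields and methods once and caching them for reuse — can be mirrored with plain-Java reflection (class and field names here are hypothetical stand-ins):

```java
import java.lang.reflect.Field;

// Reflection analog of native_init(): look up a field handle once, cache it,
// and reuse it cheaply on every later access.
public class FieldCacheDemo {
    public static class Player {
        private long mNativeContext = 42L;  // stands in for MediaPlayer.mNativeContext
    }

    static Field contextField;              // cached once, like fields.context in JNI

    public static void main(String[] args) throws Exception {
        contextField = Player.class.getDeclaredField("mNativeContext");
        contextField.setAccessible(true);   // JNI field IDs bypass access checks too
        System.out.println(contextField.getLong(new Player()));  // prints 42
    }
}
```

The cached fields.post_event ID is what later lets the native layer call back into Java via postEventFromNative() without a lookup on every event.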
That is all native_init() does: it looks up and caches IDs for some fields and methods of the Java-layer MediaPlayer class. Next comes the native_setup() call made from MediaPlayer's constructor, found in the same file:
static void
android_media_MediaPlayer_native_setup(JNIEnv *env, jobject thiz, jobject weak_this)
{
    ALOGV("native_setup");
    sp<MediaPlayer> mp = new MediaPlayer();
    if (mp == NULL) {
        jniThrowException(env, "java/lang/RuntimeException", "Out of memory");
        return;
    }

    // create new listener and give it to MediaPlayer
    sp<JNIMediaPlayerListener> listener = new JNIMediaPlayerListener(env, thiz, weak_this);
    mp->setListener(listener);

    // Stow our new C++ MediaPlayer in an opaque field in the Java object.
    setMediaPlayer(env, thiz, mp);
}
This creates a C++-layer MediaPlayer and installs a listener for callbacks. The pattern resembles Android's Looper mechanism: for the object at the Java layer there is a counterpart object at the C++ layer.
From the steps above we can see that constructing a MediaPlayer at the Java layer ultimately creates a MediaPlayer object at the C++ layer. With that, the construction of MediaPlayer is complete.
(3) The setDataSource process (setDataSource, part 1)
In the Android app: mediaPlayer.setDataSource("abc.mp3");
Again in frameworks/base/media/jni/android_media_MediaPlayer.cpp, we use the mapping array to find the corresponding function (we take playing a local file as our example, so the corresponding function is android_media_MediaPlayer_setDataSourceFD):
static void
android_media_MediaPlayer_setDataSourceFD(JNIEnv *env, jobject thiz, jobject fileDescriptor, jlong offset, jlong length)
{
    sp<MediaPlayer> mp = getMediaPlayer(env, thiz);
    if (mp == NULL) {
        jniThrowException(env, "java/lang/IllegalStateException", NULL);
        return;
    }

    if (fileDescriptor == NULL) {
        jniThrowException(env, "java/lang/IllegalArgumentException", NULL);
        return;
    }

    int fd = jniGetFDFromFileDescriptor(env, fileDescriptor);
    ALOGV("setDataSourceFD: fd %d", fd);
    process_media_player_call(env, thiz, mp->setDataSource(fd, offset, length),
            "java/io/IOException", "setDataSourceFD failed.");
}
First it retrieves the C++-layer MediaPlayer created earlier, then calls process_media_player_call() to invoke MediaPlayer's setDataSource() and check the returned status. (We will not expand process_media_player_call() here; you can read it yourself. It mainly performs error and exception checking and then notifies the corresponding error state.)
What the C++ MediaPlayer's setDataSource() actually does we will analyze in the next part; for now it is enough to know that the Java-layer setDataSource() ultimately calls the setDataSource() of the C++-layer MediaPlayer class.
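The status-checking role of process_media_player_call() can be sketched as follows (the constants and the mapping here are illustrative stand-ins, not the real framework values): a native status code either passes through silently or is translated into the matching Java exception.

```java
// Hedged sketch of status -> exception translation.
public class StatusCheck {
    public static final int OK = 0;
    public static final int INVALID_OPERATION = -38;  // illustrative value
    public static final int PERMISSION_DENIED = -1;   // illustrative value

    public static void throwOnError(int status, String ioMessage) {
        if (status == OK) return;                     // success: nothing to do
        if (status == INVALID_OPERATION) throw new IllegalStateException();
        if (status == PERMISSION_DENIED) throw new SecurityException();
        // everything else surfaces with the caller-supplied message
        throw new RuntimeException(ioMessage + " (status " + status + ")");
    }

    public static void main(String[] args) {
        throwOnError(OK, "setDataSourceFD failed.");  // OK passes through
        System.out.println("ok");
    }
}
```

This is why a Java-layer setDataSource() call can end in an IOException or IllegalStateException even though the failure originated deep in native code.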
(4) The setDisplay process
In the Android app: mediaPlayer.setDisplay(surfaceHolder);
At the Java layer:
public void setDisplay(SurfaceHolder sh) {
    mSurfaceHolder = sh;
    Surface surface;
    if (sh != null) {
        surface = sh.getSurface();
    } else {
        surface = null;
    }
    _setVideoSurface(surface);
    updateSurfaceScreenOn();
}

private native void _setVideoSurface(Surface surface);
This eventually reaches the JNI-layer function android_media_MediaPlayer_setVideoSurface(), again in frameworks/base/media/jni/android_media_MediaPlayer.cpp:
static void
android_media_MediaPlayer_setVideoSurface(JNIEnv *env, jobject thiz, jobject jsurface)
{
    setVideoSurface(env, thiz, jsurface, true /* mediaPlayerMustBeAlive */);
}

static void
setVideoSurface(JNIEnv *env, jobject thiz, jobject jsurface, jboolean mediaPlayerMustBeAlive)
{
    sp<MediaPlayer> mp = getMediaPlayer(env, thiz);  // get the C++-layer MediaPlayer
    if (mp == NULL) {
        if (mediaPlayerMustBeAlive) {
            jniThrowException(env, "java/lang/IllegalStateException", NULL);
        }
        return;
    }

    decVideoSurfaceRef(env, thiz);

    sp<IGraphicBufferProducer> new_st;
    if (jsurface) {  // get the Java-layer Surface
        sp<Surface> surface(android_view_Surface_getSurface(env, jsurface));
        if (surface != NULL) {
            new_st = surface->getIGraphicBufferProducer();  // get the IGraphicBufferProducer
            if (new_st == NULL) {
                jniThrowException(env, "java/lang/IllegalArgumentException",
                        "The surface does not have a binding SurfaceTexture!");
                return;
            }
            new_st->incStrong((void*)decVideoSurfaceRef);
        } else {
            jniThrowException(env, "java/lang/IllegalArgumentException",
                    "The surface has been released");
            return;
        }
    }

    env->SetLongField(thiz, fields.surface_texture, (jlong)new_st.get());

    // This will fail if the media player has not been initialized yet. This
    // can be the case if setDisplay() on MediaPlayer.java has been called
    // before setDataSource(). The redundant call to setVideoSurfaceTexture()
    // in prepare/prepareAsync covers for this case.
    mp->setVideoSurfaceTexture(new_st);
}
This code mainly stores the Surface used for video display: it decrements the strong reference on the old IGraphicBufferProducer, obtains the new IGraphicBufferProducer, and finally calls the C++-layer MediaPlayer's setVideoSurfaceTexture() to install it.
IGraphicBufferProducer belongs to the SurfaceFlinger world. SurfaceFlinger plays the key role in getting a UI all the way onto the display; its job, as the name "Flinger" suggests, is to composite the final drawing results of all applications and render them onto the physical screen. BufferQueue takes part in this drawing process as a one-on-one tutor for each application, guiding details such as "requesting a drawing board" and the "painting workflow". BufferQueue is also the server side of IGraphicBufferProducer: the app talks to BufferQueue through IGraphicBufferProducer.
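The producer/consumer handshake described above can be sketched with a toy single-threaded analog (purely illustrative; the real BufferQueue moves GPU buffers across processes):

```java
import java.util.concurrent.ArrayBlockingQueue;

// Toy analog of the BufferQueue handshake: the producer side (the app, via
// IGraphicBufferProducer) dequeues a free buffer, draws into it, and queues
// it back; the consumer side (SurfaceFlinger) then acquires it.
public class ToyBufferQueue {
    public static void main(String[] args) throws InterruptedException {
        ArrayBlockingQueue<int[]> free = new ArrayBlockingQueue<>(2);
        ArrayBlockingQueue<int[]> queued = new ArrayBlockingQueue<>(2);
        free.put(new int[16]);          // pre-allocated "graphic buffer"

        int[] buf = free.take();        // dequeueBuffer(): app gets a free buffer
        buf[0] = 0xFF00FF;              // app renders a frame into it
        queued.put(buf);                // queueBuffer(): hand it to the consumer

        int[] frame = queued.take();    // acquireBuffer(): SurfaceFlinger composites
        System.out.println(frame[0] == 0xFF00FF);  // prints true
        free.put(frame);                // releaseBuffer(): recycle for the next frame
    }
}
```

Cycling buffers between a free pool and a queued list is what lets the app draw the next frame while SurfaceFlinger is still compositing the previous one.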
(5) The subsequent prepare and start steps deal directly with the C++-layer MediaPlayer; we will analyze them later.
Summary:
This part only analyzed how MediaPlayer is constructed across the app layer and the C++ layer, to give everyone a rough picture of the overall playback flow.