Android Camera Execution Flow

1. Overall Architecture


From a high level, the Android Camera framework is a client/service architecture with two processes:

The client process, which can be regarded as the AP side, consists mainly of Java code plus some native C/C++ code.

The service process, the server side, is native C/C++ code. It is responsible for interacting with the camera driver in the Linux kernel, collecting the data the driver delivers, and handing it to the display system for rendering.

The client and service processes communicate through the Binder mechanism: the client implements each concrete function by calling interfaces exposed by the service.


 

2. Registering the CameraService

Since Camera communicates over Binder, its service must first be registered with ServiceManager so that clients can obtain it later. Where does that happen? A careful look at frameworks\av\base\media\mediaserver\Main_MediaServer.cpp reveals a main() function that registers the media services. CameraService is registered there:

int main(int argc, char** argv)
{
    sp<ProcessState> proc(ProcessState::self());
    sp<IServiceManager> sm = defaultServiceManager();
    LOGI("ServiceManager: %p", sm.get());
    AudioFlinger::instantiate();
    MediaPlayerService::instantiate();
    CameraService::instantiate();
    AudioPolicyService::instantiate();
    ProcessState::self()->startThreadPool();
    IPCThreadState::self()->joinThreadPool();
}

But searching the CameraService sources for instantiate() turns up nothing. Where is it? It comes from one of its parent classes, BinderService.

CameraService is defined in frameworks/av/base/services/camera/libcameraservice/CameraService.h:

class CameraService :
    public BinderService<CameraService>,
    public BnCameraService
{
    class Client;
    friend class BinderService<CameraService>;
public:
    static char const* getServiceName() { return "media.camera"; }
    .....

    .....

};

This definition shows that CameraService inherits from BinderService, so CameraService::instantiate() actually calls BinderService::instantiate().


BinderService is defined in frameworks/av/base/include/binder/BinderService.h:

// ---------------------------------------------------------------------------
namespace android {

template<typename SERVICE>
class BinderService
{
public:
    static status_t publish() {
        sp<IServiceManager> sm(defaultServiceManager());
        return sm->addService(String16(SERVICE::getServiceName()), new SERVICE());
    }
    static void publishAndJoinThreadPool() {
        sp<ProcessState> proc(ProcessState::self());
        sp<IServiceManager> sm(defaultServiceManager());
        sm->addService(String16(SERVICE::getServiceName()), new SERVICE());
        ProcessState::self()->startThreadPool();
        IPCThreadState::self()->joinThreadPool();
    }

    static void instantiate() { publish(); }

    static status_t shutdown() {
        return NO_ERROR;
    }
};

}; // namespace android
// ---------------------------------------------------------------------------
publish() is where CameraService actually registers the service. Note the SERVICE type; the source explains it:

template<typename SERVICE>
SERVICE is a template parameter. Since CameraService is being registered here, substituting it gives:
return sm->addService(String16(CameraService::getServiceName()), new CameraService());
With that, Camera is registered with ServiceManager and available to any client.
Main_MediaServer's main() is invoked by init.rc at startup, so the Camera service is registered for Binder communication as soon as the device boots.

3. From the Client Application Layer to the JNI Layer (Camera App ---> JNI)


The call flow diagram is as follows:


Section 2 showed that the Binder service is registered. Now let's see how the client connects to the server side and opens the camera module, starting from the camera app source. onCreate() starts a dedicated thread to open the camera.

Application layer

The camera app source is in packages/apps/LegacyCamera/src/com/android/camera/Camera.java:
    @Override
    public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        getPreferredCameraId();
        String[] defaultFocusModes = getResources().getStringArray(
                R.array.pref_camera_focusmode_default_array);
        mFocusManager = new FocusManager(mPreferences, defaultFocusModes);


        /*
         * To reduce startup time, we start the camera open and preview threads.
         * We make sure the preview is started at the end of onCreate.
         */
        mCameraOpenThread.start();

          ................
        mCameraPreviewThread = null;
    }
Now look at mCameraOpenThread:

    Thread mCameraOpenThread = new Thread(new Runnable() {
        public void run() {
            try {
                mCameraDevice = Util.openCamera(Camera.this, mCameraId);
            } catch (CameraHardwareException e) {
                mOpenCameraFail = true;
            } catch (CameraDisabledException e) {
                mCameraDisabled = true;
            }
        }
    });

Following Util.openCamera: the Util class is defined in packages/apps/LegacyCamera/src/com/android/camera/Util.java:
    public static android.hardware.Camera openCamera(Activity activity, int cameraId)
            throws CameraHardwareException, CameraDisabledException {
        // Check if device policy has disabled the camera.
        ...............
        try {
            return CameraHolder.instance().open(cameraId);
        } catch (CameraHardwareException e) {
            // In eng build, we throw the exception so that test tool
            // can detect it and report it
            if ("eng".equals(Build.TYPE)) {
                throw new RuntimeException("openCamera failed", e);
            } else {
                throw e;
            }
        }
    }
Now CameraHolder appears; its singleton instance is used to open the camera.

CameraHolder is defined in packages/apps/LegacyCamera/src/com/android/camera/CameraHolder.java:

    public synchronized android.hardware.Camera open(int cameraId)
            throws CameraHardwareException {
       ..............
        if (mCameraDevice == null) {
            try {
                Log.v(TAG, "open camera " + cameraId);
                mCameraDevice = android.hardware.Camera.open(cameraId); // enters the framework layer
                mCameraId = cameraId;
            } catch (RuntimeException e) {
                Log.e(TAG, "fail to connect Camera", e);
                throw new CameraHardwareException(e);
            }
            mParameters = mCameraDevice.getParameters();
        } else {
            ............
        }
        ++mUsers;
        mHandler.removeMessages(RELEASE_CAMERA);
        mKeepBeforeTime = 0;
        return mCameraDevice;
    }

This calls the open() method of frameworks\base\core\java\android\hardware\Camera.java, entering the framework layer.

Framework

The framework-layer open() is defined in frameworks\base\core\java\android\hardware\Camera.java:

    public static Camera open(int cameraId) {
        return new Camera(cameraId);
    }
This invokes the Camera constructor, which performs some simple initialization of the object's fields:
   Camera(int cameraId) {
        mShutterCallback = null;
        mRawImageCallback = null;
        mJpegCallback = null;
        mPreviewCallback = null;
        mPostviewCallback = null;
        mZoomListener = null;


        Looper looper;
        if ((looper = Looper.myLooper()) != null) {
            mEventHandler = new EventHandler(this, looper);
        } else if ((looper = Looper.getMainLooper()) != null) {
            mEventHandler = new EventHandler(this, looper);
        } else {
            mEventHandler = null;
        }

        native_setup(new WeakReference<Camera>(this), cameraId);  // calls into JNI
   }

From here, execution passes through JNI into native_setup(), during which a JNI object is registered as the listener of the Camera class.

JNI layer

native_setup() is implemented in libandroid_runtime.so and is called from the framework layer through JNI. It does two things:

1. Establishes the connection between the client and server of the Camera C/S architecture (by calling connect(), entering libcamera_client.so).

2. Sets a listener class to handle the data and messages delivered by the lower-layer camera callbacks.


native_setup() is defined in frameworks/base/core/jni/android_hardware_Camera.cpp:

static JNINativeMethod camMethods[] = {
  { "native_setup",
    "(Ljava/lang/Object;I)V",
    (void*)android_hardware_Camera_native_setup },

  { "startPreview",
    "()V",
    (void *)android_hardware_Camera_startPreview },

  { "native_autoFocus",
    "()V",
    (void *)android_hardware_Camera_autoFocus },

    ..................
};

This table binds native_setup() to android_hardware_Camera_native_setup(). The call native_setup(new WeakReference<Camera>(this), cameraId) therefore lands in the function below:

// connect to camera service

static void android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz,
    jobject weak_this, jint cameraId)
{

    sp<Camera> camera = Camera::connect(cameraId);

     ...........

    // make sure camera hardware is alive
    if (camera->getStatus() != NO_ERROR) {
        jniThrowRuntimeException(env, "Camera initialization failed");
        return;
    }
    ...........

    // We use a weak reference so the Camera object can be garbage collected.
    // The reference is only used as a proxy for callbacks.
    
    sp<JNICameraContext> context = new JNICameraContext(env, weak_this, clazz, camera);
    context->incStrong(thiz);
    camera->setListener(context);


    // save context in opaque field
    env->SetIntField(thiz, fields.context, (int)context.get());

}
Inside this JNI function we find the client of the Camera C/S architecture: it calls connect() to request a connection from the server. JNICameraContext is a listener class that handles the data and messages coming up from the lower-layer camera callbacks.


4. Connecting the Client to the Service

Client side

What does the client-side connect() do? It is implemented in libcamera_client.so; the source is frameworks/av/camera/Camera.cpp:

sp<Camera> Camera::connect(int cameraId)
{
    LOGV("connect");
    sp<Camera> c = new Camera();
    const sp<ICameraService>& cs = getCameraService(); // obtain the CameraService handle
    if (cs != 0) {
        c->mCamera = cs->connect(c, cameraId);
    }
    if (c->mCamera != 0) {
        c->mCamera->asBinder()->linkToDeath(c);
        c->mStatus = NO_ERROR;
    } else {
        c.clear();
    }
    return c;
}

Here const sp<ICameraService>& cs = getCameraService(); obtains the CameraService handle. Step into getCameraService():

getCameraService() is also in frameworks/av/camera/Camera.cpp:

// establish binder interface to camera service
const sp<ICameraService>& Camera::getCameraService()
{
    Mutex::Autolock _l(mLock);
    if (mCameraService.get() == 0) {
        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        ....................
        binder->linkToDeath(mDeathNotifier);
        mCameraService = interface_cast<ICameraService>(binder);
    }
    LOGE_IF(mCameraService==0, "no CameraService!?");
    return mCameraService;
}
The CameraService handle is obtained over Binder; mCameraService is the client-side proxy for CameraService.

Service side

The service side is implemented in libcameraservice.so.

Back in sp<Camera> Camera::connect(int cameraId):

c->mCamera = cs->connect(c, cameraId);
This executes the service's connect() and returns an ICamera object, which is assigned to the Camera's mCamera. What the server's connect() returns is an instance of its inner class.

The service-side connect() is implemented in libcameraservice.so; its source is frameworks/av/services/camera/libcameraservice/CameraService.cpp:

sp<ICamera> CameraService::connect(
        const sp<ICameraClient>& cameraClient, int cameraId) {
    int callingPid = getCallingPid();
    sp<CameraHardwareInterface> hardware = NULL;
    ....................

    hardware = new CameraHardwareInterface(camera_device_name);
    if (hardware->initialize(&mModule->common) != OK) {
        hardware.clear();
        return NULL;
    }


    client = new Client(this, cameraClient, hardware, cameraId, info.facing, callingPid);
    mClient[cameraId] = client;
    LOG1("CameraService::connect X");
    return client;
}
First a camera HAL interface object, hardware, is instantiated; hardware->initialize() then enters the HAL layer and opens the camera driver. Finally, a new Client() is returned to the client side.


5. HAL Layer

On the service side, initialize() enters the HAL layer to open the camera driver. initialize() is implemented in CameraHardwareInterface; its source is:
frameworks/av/services/camera/libcameraservice/CameraHardwareInterface.h

status_t initialize(hw_module_t *module)
{
        LOGI("Opening camera %s", mName.string());
        int rc = module->methods->open(module, mName.string(), (hw_device_t **)&mDevice);
        if (rc != OK) {
            LOGE("Could not open camera %s: %d", mName.string(), rc);
            return rc;
        }
        initHalPreviewWindow();
        return rc;
}
Here module->methods->open() actually opens the camera device. The module argument is passed in by the caller (service side: hardware->initialize(&mModule->common)). The mModule member is declared in the CameraService class, in frameworks/av/services/camera/libcameraservice/CameraService.h:

class CameraService :
    public BinderService<CameraService>,
    public BnCameraService
{

    class Client : public BnCamera
    {
    public:
        ......

    private:

        .....

    };

    camera_module_t *mModule;

};

To understand the whole flow, we also need the definition of camera_module_t. Tracing it back, camera_module_t is defined in hardware/libhardware/include/hardware/camera_common.h:

typedef struct camera_module {
    hw_module_t common;
    int (*get_number_of_cameras)(void);
    int (*get_camera_info)(int camera_id, struct camera_info *info);
} camera_module_t;
It contains the get_number_of_cameras() and get_camera_info() methods for querying camera information.

The member hw_module_t common is especially important: it is through the open() method reachable from hw_module_t that the device file is opened.

Next, the definition of hw_module_t, in hardware/libhardware/include/hardware/hardware.h:

struct hw_module_t;
struct hw_module_methods_t;
struct hw_device_t;

/**
 * Every hardware module must have a data structure named HAL_MODULE_INFO_SYM
 * and the fields of this data structure must begin with hw_module_t
 * followed by module specific information.
 */
typedef struct hw_module_t {
    ......................
    /** Modules methods */
    struct hw_module_methods_t* methods;
     ......................
    
} hw_module_t;

Likewise, the definition of hw_module_methods_t:

typedef struct hw_module_methods_t {
    /** Open a specific device */
    int (*open)(const struct hw_module_t* module, const char* id, struct hw_device_t** device);
} hw_module_methods_t;

hw_module_methods_t contains a single method, open(), which opens the camera driver and provides the bridge to the hardware layer.

open() is a function pointer; it is assigned in /hardware/qcom/camera/QualcommCamera.cpp:

static hw_module_methods_t camera_module_methods = {
    open: camera_device_open,
};


The call flow of camera_device_open() is as follows:



As the figure shows, the HAL-layer callback module->methods->open(module, mName.string(), (hw_device_t **)&mDevice) eventually reaches mm_camera_open():

int32_t mm_camera_open(mm_camera_obj_t *my_obj, mm_camera_op_mode_type_t op_mode)
{
    .......................................
    snprintf(dev_name, sizeof(dev_name), "/dev/%s", m_camera_util_get_dev_name(my_obj));
    do {
        n_try--;
        my_obj->ctrl_fd = open(dev_name, O_RDWR | O_NONBLOCK);
        ...................
    } while (n_try > 0);
    ....................
    return rc;
}


This uses the open() system call to open the device node /dev/video0 (rear camera) or /dev/video2 (front camera); which node maps to which camera depends on the order in which the video devices were registered during kernel boot.

Note: the open() here is the userspace system call; it ultimately maps to the kernel (driver) open function msm_open():

/kernel/drivers/media/video/msm/msm.c

static struct v4l2_file_operations g_msm_fops = {
    .owner   = THIS_MODULE,
    .open    = msm_open,
    .poll    = msm_poll,
    .mmap    = msm_mmap,
    .release = msm_close,
    .ioctl   = video_ioctl2,
};


msm_open() is analyzed in detail in Section 6, the driver layer.

Stepping back: where are the device nodes /dev/video0 (rear camera) and /dev/video2 (front camera) registered?

6. Driver Layer

This section analyzes the camera driver layer, using the Qualcomm platform as an example.

Registration of the camera nodes under /dev/ is done by the driver, in msm_cam_dev_init(); the source is /kernel/drivers/media/video/msm/msm.c.

Within it, video_register_device() registers the camera node.

And pvdev->ops = &g_msm_fops provides userspace with the interface functions for opening the camera sensor.


The HAL-layer open() ultimately calls the kernel driver's msm_open().

msm_open() is in /kernel/drivers/media/video/msm/msm.c:

static int msm_open(struct file *f)
{
    ......................

    /* Now we really have to activate the camera */
    D("%s: call mctl_open\n", __func__);
    rc = pmctl->mctl_open(pmctl, MSM_APPS_ID_V4L2); /* open the camera */
    if (rc < 0) {
        pr_err("%s: HW open failed rc = 0x%x\n", __func__, rc);
        goto mctl_open_failed;
    }
    pmctl->pcam_ptr = pcam;
    ........................
    if (pcam->use_count == 1) {
        rc = msm_send_open_server(pcam); /* analyzed below */
        ...............
    }
    ........................
}


msm_send_open_server() is analyzed later.

msm_open() calls mctl_open(), which is where the camera is really opened. mctl_open is again a function pointer; it is assigned in /kernel/drivers/media/video/msm/msm_mctl.c:

/* this function plug in the implementation of a v4l2_subdev */
int msm_mctl_init(struct msm_cam_v4l2_device *pcam)
{
    ...........
    /* init module operations */
    pmctl->mctl_open    = msm_mctl_open;
    pmctl->mctl_cmd     = msm_mctl_cmd;
    pmctl->mctl_release = msm_mctl_release;
    ...........
}

msm_mctl_open() is in /kernel/drivers/media/video/msm/msm_mctl.c:

static int msm_mctl_open(struct msm_cam_media_controller *p_mctl, const char *const apps_id)
{
    ..............................
    /* then sensor - move sub dev later */
    rc = v4l2_subdev_call(p_mctl->sensor_sdev, core, s_power, 1);
    ...............................
}


msm_mctl_open() calls v4l2_subdev_call(). The s_power op there is yet another function pointer, which resolves by callback to msm_sensor_power(); the assignment is in /kernel/drivers/media/video/msm/sensor/s5k4e1_v4l2.c:

static struct v4l2_subdev_core_ops s5k4e1_subdev_core_ops = {
    .ioctl   = msm_sensor_subdev_ioctl,
    .s_power = msm_sensor_power,
};


Calling msm_sensor_power() powers up the camera sensor.

At this point, however, the camera has not been initialized; it has only been powered up and had its ID read. So when does the sensor get initialized? Let's keep going...


Earlier we opened the /dev/video0 node. At the end of msm_open(), msm_send_open_server() is called, and this function wakes up the userspace config thread. Its source is /kernel/drivers/media/video/msm/msm.c:

/* send open command to server */
static int msm_send_open_server(struct msm_cam_v4l2_device *pcam)
{
    int rc = 0;
    struct msm_ctrl_cmd ctrlcmd;
    D("%s qid %d\n", __func__, pcam->server_queue_idx);
    ctrlcmd.type         = MSM_V4L2_OPEN;
    ctrlcmd.timeout_ms   = 10000;
    ctrlcmd.length       = strnlen(g_server_dev.config_info.config_dev_name[0], MAX_DEV_NAME_LEN) + 1;
    ctrlcmd.value        = (char *)g_server_dev.config_info.config_dev_name[0];
    ctrlcmd.vnode_id     = pcam->vnode_id;
    ctrlcmd.queue_idx    = pcam->server_queue_idx;
    ctrlcmd.config_ident = g_server_dev.config_info.config_dev_id[0];
    /* send command to config thread in userspace, and get return value */
    rc = msm_server_control(&g_server_dev, &ctrlcmd);
    return rc;
}


Note the timeout here: the request must complete within 10 seconds (timeout_ms = 10000); otherwise the config thread times out, the camera becomes unusable, and only a reboot recovers it.

msm_server_control() sends the MSM_V4L2_OPEN command (ctrlcmd.type = MSM_V4L2_OPEN in the code) to the userspace config thread. The userspace thread is then woken up and runs qcamsvr_process_server_node_event(); its source is \vendor\qcom\proprietary\mm-camera\server\core\qcamsvr.c:

static int qcamsvr_process_server_node_event(struct config_thread_arguments *config_arg,
        struct msm_mctl_node_info *mctl_node_info, gesture_info_t *p_gesture_info)
{
    ..................................
    if (ctrl->type == MSM_V4L2_OPEN) {
        CDBG("MSM_V4L2_OPEN is received\n");
        snprintf(config_arg->config_name, MAX_DEV_NAME_LEN, "%s", (char *)event_data.isp_data.ctrl.value);
        CDBG("%s: OPEN %s\n", __func__, config_arg->config_name);
        .........................
        if ((tmp_mctl_struct->handle = create_v4l2_conf_thread(config_arg)) == NULL) {
            CDBG_ERROR("%s: create_v4l2_conf_thread failed", __func__);
            ctrl->status = CAM_CTRL_FAILED;
            v4l2_ioctl.ioctl_ptr = ctrl;
            goto error_config_thread_creation;
        }
        .........................
    }
    ....................................
}


So when userspace receives the kernel's request, it calls create_v4l2_conf_thread() to create the userspace config thread.

The whole flow now returns to userspace...

7. Back to Userspace

Under /dev/ there is a device node ./msm_camera/config0, i.e. /dev/msm_camera/config0. Once opened, its main purpose is to carry out the sensor initialization. Two questions about this node:

① When is it created?

In msm_camera_probe() (source: /kernel/drivers/media/video/msm/msm.c), mainly by these two statements:

msm_class = class_create(THIS_MODULE, "msm_camera");

device_config = device_create(msm_class, NULL, devno, NULL, "%s%d", device_name, dev_num); // where device_name = "config", dev_num = 0

② When is it opened? See the analysis below.


As analyzed above, when the kernel driver sends the MSM_V4L2_OPEN request to userspace, create_v4l2_conf_thread() creates the userspace config thread. Its source is \vendor\qcom\proprietary\mm-camera\server\core\mctl\mctl.c:

void *create_v4l2_conf_thread(struct config_thread_arguments* arg)
{
    .................................
    rc = pthread_create(&pme->cam_mctl_thread_id, NULL, cam_mctl_thread, pme);
   ...........................
}


It creates a thread whose entry function is cam_mctl_thread(), which in turn calls mctl_init(), i.e. cam_mctl_thread() -----> mctl_init(). The source of mctl_init():

/vendor/qcom/proprietary/mm-camera/server/core/mctl/mctl.c

int mctl_init(m_ctrl_t* pme)
{
    ..........................
    CDBG("%s: dev name is %s\n", __func__, config_device);
    p_cfg_ctrl->camfd = open(config_device, O_RDWR);
    ..............................
    mctl_proc_init_ops(p_cfg_ctrl);
    s_comp_ops = &p_cfg_ctrl->comp_ops[MCTL_COMPID_SENSOR];
    if (!s_comp_ops->handle) {
        sensor_init_data_t sensor_init_data;
        tmp_handle = sensor_client_open(s_comp_ops);
        ...........................
    }
    ...................................
    if (s_comp_ops->init) {
        rc = s_comp_ops->init(s_comp_ops->handle, &p_cfg_ctrl->ops, &sensor_init_data);
        .......................
    }
    ....................................
}


After open() opens the /dev/msm_camera/config0 node, sensor_client_open() is called, and finally s_comp_ops->init(), which uses the config0 node to complete the sensor initialization. sensor_client_open() fills in the function pointer ops->init; its source:

/vendor/qcom/proprietary/mm-camera/server/hardware/sensor/sensor_interface.c

uint32_t sensor_client_open(module_ops_t *ops)
{
    .....................
    ops->handle     = (uint32_t)sensor_client->handle;
    ops->init       = sensor_client_init;
    ops->set_params = sensor_client_set_params;
    ops->get_params = sensor_client_get_params;
    ops->process    = NULL;
    ops->abort      = NULL;
    ops->destroy    = sensor_client_destroy;
    ............................
}


Therefore, calling s_comp_ops->init() actually invokes sensor_client_init(), which in turn calls sensor_init():

s_comp_ops->init() --> sensor_client_init() ---> sensor_init(). The source of sensor_init():

/vendor/qcom/proprietary/mm-camera/server/hardware/sensor/sensor.c

int8_t sensor_init(sensor_ctrl_t *sctrl)
{
    struct msm_camsensor_info sinfo;
    ......................
    sensor_common_parm_init(sctrl);
    rc = ioctl(sctrl->sfd, MSM_CAM_IOCTL_GET_SENSOR_INFO, &sinfo);
    ........................
    sctrl->start = &sensors[cnt];
    rc = sctrl->start->s_start(sctrl);
    if (sctrl->sensor.out_data.sensor_output.output_format == SENSOR_BAYER) {
        rc = sensor_load_chromatix(sctrl);
        ........................
    }
    ........................
}


In sensor_init(), the ioctl() system call retrieves the sensor information (sensor type, YUV or RAW; whether AF is enabled; flash type; sensor name; and so on).

About rc = sctrl->start->s_start(sctrl):

s_start() resolves to s5k4e1_process_start(); the function pointer is assigned through a macro. See the source in /vendor/qcom/proprietary/mm-camera/server/hardware/sensor/sensor.c for details.

rc = sensor_load_chromatix(sctrl);

This statement loads the chromatix library file (RAW sensors only).


Now examine s5k4e1_process_start(); its source is /vendor/qcom/proprietary/mm-camera/server/hardware/sensor/s5k4e1/s5k4e1_u.c:

int8_t s5k4e1_process_start(void *ctrl)
{
   sensor_ctrl_t *sctrl = (sensor_ctrl_t *) ctrl;
   sctrl->fn_table = &s5k4e1_func_tbl;
   sctrl->sensor.inputformat = s5k4e1_inputformat;
   sctrl->sensor.crop_info = s5k4e1_cropinfo;
   sctrl->sensor.mode_res = s5k4e1_mode_res;
   sensor_util_get_output_info(sctrl);
   sctrl->sensor.op_mode = SENSOR_MODE_VIDEO;
   sctrl->sensor.out_data.sensor_output.connection_mode = SENSOR_MIPI_CSI;
   sctrl->sensor.out_data.sensor_output.output_format = SENSOR_BAYER;
   sctrl->sensor.out_data.sensor_output.raw_output = SENSOR_10_BIT_DIRECT;
   sctrl->sensor.out_data.aec_info.max_gain = 16.0;
   sctrl->sensor.out_data.aec_info.max_linecount =
       sctrl->sensor.output_info[sctrl->sensor.mode_res[SENSOR_MODE_PREVIEW]].frame_length_lines * 24;
   sctrl->sensor.snapshot_exp_wait_frames = 1;
   sctrl->sensor.out_data.lens_info.focal_length = 3.49;
   sctrl->sensor.out_data.lens_info.pix_size = 1.4;
   sctrl->sensor.out_data.lens_info.f_number = 2.2;
   sctrl->sensor.out_data.lens_info.total_f_dist = 1.97;
   sctrl->sensor.out_data.lens_info.hor_view_angle = 54.8;
   sctrl->sensor.out_data.lens_info.ver_view_angle = 42.5;
   sensor_util_config(sctrl);
    return TRUE;
}

Where:

sensor_util_get_output_info(sctrl); calls into the kernel to get the output dimensions (width, height, etc.).

sensor_util_config(sctrl); calls into the kernel to complete the initialization.


At this point, the analysis of the entire camera open flow is complete.

It is now easy to see that the Camera call flow in Android spans the following layers:

Package -> Framework -> JNI -> Camera (cpp) --(binder)--> CameraService -> Camera HAL -> Camera Driver -> the kernel driver wakes userspace via a message to complete sensor initialization -> open complete

Source: http://blog.csdn.net/unicornkylin/article/details/13293295
