Android Virtual Camera

        Typing out this title, I'm honestly a little excited. I've spent several weeks on this virtual camera, and even though stock Android doesn't support it at all, I managed to work around one restriction after another and get the feature working properly. For the next few days I can finally get some decent sleep.

        Enough small talk; let's look at what a virtual camera actually is. When people hear the term, the first thing that usually comes to mind is a scene like this: some guy is happily video-chatting with a "pretty girl" online, drooling over the face in the video, while on the other end of the connection the person he is actually chatting with is not the girl in the video at all, but a guy picking his feet and smoking a cigarette. So how does that guy turn into the girl on screen? The trick is a virtual camera. On his phone, a virtual camera feeds a pre-recorded video of the girl into the camera interface, so what the "camera" reads is not real live sensor data but frames from that recorded clip, and what shows up on the other side is simply the recorded video.

        That is one use case for a virtual camera. There is another: letting multiple apps open the same camera at the same time and preview the same camera data. To be clear, this does not mean creating several surfaces inside one app and showing the same camera data on all of them; distributing camera data to multiple surfaces within a single app is easy and supported by the stock camera framework (see my earlier blog post for details).

        Letting multiple apps open the same camera and see the same data, however, is trickier. First of all, the camera framework does not allow more than one app to use the camera at the same time, even if the apps open cameras with different ids. That restriction is easy to lift: we can modify the ClientManager constructor in services/camera/libcameraservice/utils/ClientManager.h as follows:

//Original version: mMaxCost is the maximum number of apps that may have a camera open at the same time
template<class KEY, class VALUE, class LISTENER>
ClientManager<KEY, VALUE, LISTENER>::ClientManager(int32_t totalCost) : mMaxCost(totalCost) {}


//Work around the multi-process conflict: *10 allows up to 10 cameras to be open at the same time
template<class KEY, class VALUE, class LISTENER>
ClientManager<KEY, VALUE, LISTENER>::ClientManager(int32_t totalCost) : mMaxCost(totalCost*10) {}

        The reason this works: when a camera is opened, the framework ends up in connectHelper in CameraService.cpp, which calls handleEvictionsLocked to check for conflicts. Stepping into handleEvictionsLocked, you will see that it calls mActiveClientManager.wouldEvict; once that function returns the conflicting clients, the code below immediately disconnects them:

    for (auto& i : evictedClients) {
        // Disconnect is blocking, and should only have returned when HAL has cleaned up
        i->getValue()->disconnect(); // Clients will remove themselves from the active client list
    }

        wouldEvict is defined in frameworks/av/services/camera/libcameraservice/utils/ClientManager.h and delegates to wouldEvictLocked, which checks the priority of the calling app and the number of cameras currently open:

    for (const auto& i : mClients) {
        const KEY& curKey = i->getKey();
        int32_t curCost = i->getCost();
        ClientPriority curPriority = i->getPriority();
        int32_t curOwner = i->getOwnerId();

        bool conflicting = (curKey == key || i->isConflicting(key) ||
                client->isConflicting(curKey));
        //If we force this to false, handleEvictionsLocked in CameraService will no longer reject a camera id that is already open
        conflicting = false;
        ALOGD("wouldEvictLocked totalCost=%" PRId64 ", mMaxCost=%d, conflicting=%d", totalCost, mMaxCost, conflicting);
        if (!returnIncompatibleClients) {
            // Find evicted clients

            if (conflicting && curPriority < priority) {
                // Pre-existing conflicting client with higher priority exists
                evictList.clear();
                evictList.push_back(client);
                return evictList;
            } else if (conflicting || ((totalCost > mMaxCost && curCost > 0) &&
                    (curPriority >= priority) &&
                    !(highestPriorityOwner == owner && owner == curOwner))) {
                // Add a pre-existing client to the eviction list if:
                // - We are adding a client with higher priority that conflicts with this one.
                // - The total cost including the incoming client's is more than the allowable
                //   maximum, and the client has a non-zero cost, lower priority, and a different
                //   owner than the incoming client when the incoming client has the
                //   highest priority.
                evictList.push_back(i);
                totalCost -= curCost;
            }
        } else {
            // Find clients preventing the incoming client from being added

            if (curPriority < priority && (conflicting || (totalCost > mMaxCost && curCost > 0))) {
                // Pre-existing conflicting client with higher priority exists
                evictList.push_back(i);
            }
        }
    }

        If the number of open cameras exceeds mMaxCost, the incoming client is treated as conflicting, pushed onto the eviction list, and then disconnected back in CameraService.cpp.

        That's all I'll say about the limit on how many apps may open cameras; there is plenty of more detailed material about it online if you want to dig deeper.

        With the first problem solved, multiple apps can now open different cameras at the same time. Opening the same camera is still a problem, though: if the requested id is already open and the incoming app has a lower priority than the app that owns it, the incoming app gets shut down; if the incoming app has a higher priority, the camera of the already-running app is closed instead. That logic is also visible in the code above, and we deal with it by simply forcing conflicting to false.

        With conflicting forced to false, cameraService no longer checks whether the requested camera id is already open; it keeps going and calls makeClient, and eventually, when the real camera is opened a second time, the HAL reports an "already opened" error.

        To let multiple apps open the same camera, the HAL must not report that error either. But following the existing flow and calling open on the camera again obviously cannot work: it has already been opened once, so how could it be opened a second time? So how do we deal with this?

        The answer is a virtual camera.

        We can add a new camera HAL module in the HAL layer. It defines its own camera_module_t:

camera_module_t HAL_MODULE_INFO_SYM = {
    .common = {
        .tag                = HARDWARE_MODULE_TAG,
        .module_api_version = CAMERA_MODULE_API_VERSION_2_3,
        .hal_api_version    = HARDWARE_HAL_API_VERSION,
        //.id                 = CAMERA_HARDWARE_MODULE_ID, 
        .id                 = "virtual_camera",
        .name               = "virtual_camera", 
        .author             = "Antmicro Ltd.",
        .methods            = &android::HalModule::moduleMethods,
        .dso                = NULL,
        .reserved           = {0}
    },
    .get_number_of_cameras  = android::HalModule::getNumberOfCameras,
    .get_camera_info        = android::HalModule::getCameraInfo,
    .set_callbacks          = android::HalModule::setCallbacks,
};

        It implements the open function:

static struct hw_module_methods_t moduleMethods = {
    .open = openDevice
};
static int openDevice(const hw_module_t *module, const char *name, hw_device_t **device) {
    ALOGI("%s: lihb openDevice, name=%s", __FUNCTION__, name);
    if (module != &HAL_MODULE_INFO_SYM.common) {
        ALOGI("%s: invalid module (%p != %p)", __FUNCTION__, module, &HAL_MODULE_INFO_SYM.common);
        return -EINVAL;
    }
    if (name == NULL) {
        ALOGI("%s: NULL name", __FUNCTION__);
        return -EINVAL;
    }
    errno = 0;
    int cameraId = (int)strtol(name, NULL, 10);
    ALOGI("%s: cameraId: %d, getNumberOfCameras: %d", __FUNCTION__, cameraId, getNumberOfCameras());
    if(errno || cameraId < 0 || cameraId >= getNumberOfCameras()) {
        ALOGI("%s: invalid camera ID (%s)", __FUNCTION__, name);
        return -EINVAL;
    }
    if(!cams[cameraId]->isValid()) {
        ALOGI("%s: camera %d is not initialized", __FUNCTION__, cameraId);
        *device = NULL;
        return -ENODEV;
    }

    return cams[cameraId]->openDevice(device);
}

        It also implements getCameraInfo, setCallbacks, and the other standard HAL entry points. Its Camera class derives from camera3_device and implements every standard camera HAL3 interface:

class Camera: public camera3_device {
public:
    Camera();
    virtual ~Camera();

    bool isValid() { return mValid; }

    virtual status_t cameraInfo(struct camera_info *info);

    virtual int openDevice(hw_device_t **device);
    virtual int closeDevice();
    void YV12ToI420(uint8_t *YV12,char *I420, int w,int h);
protected:
    virtual camera_metadata_t * staticCharacteristics();
    virtual int initialize(const camera3_callback_ops_t *callbackOps);
    virtual int configureStreams(camera3_stream_configuration_t *streamList);
    virtual const camera_metadata_t * constructDefaultRequestSettings(int type);
    virtual int registerStreamBuffers(const camera3_stream_buffer_set_t *bufferSet);
    virtual int processCaptureRequest(camera3_capture_request_t *request);

    /* HELPERS/SUBPROCEDURES */

    void notifyShutter(uint32_t frameNumber, uint64_t timestamp);
    void processCaptureResult(uint32_t frameNumber, const camera_metadata_t *result, const Vector<camera3_stream_buffer> &buffers);

    camera_metadata_t *mStaticCharacteristics;
    camera_metadata_t *mDefaultRequestSettings[CAMERA3_TEMPLATE_COUNT];
    CameraMetadata mLastRequestSettings;

    bool mValid;
    const camera3_callback_ops_t *mCallbackOps;

    size_t mJpegBufferSize;

private:
    ImageConverter mConverter;
    Mutex mMutex;
    uint8_t* mFrameBuffer;
    uint8_t* rszbuffer;

    /* STATIC WRAPPERS */

    static int sClose(hw_device_t *device);
    static int sInitialize(const struct camera3_device *device, const camera3_callback_ops_t *callback_ops);
    static int sConfigureStreams(const struct camera3_device *device, camera3_stream_configuration_t *stream_list);
    static int sRegisterStreamBuffers(const struct camera3_device *device, const camera3_stream_buffer_set_t *buffer_set);
    static const camera_metadata_t * sConstructDefaultRequestSettings(const struct camera3_device *device, int type);
    static int sProcessCaptureRequest(const struct camera3_device *device, camera3_capture_request_t *request);
    static void sGetMetadataVendorTagOps(const struct camera3_device *device, vendor_tag_query_ops_t* ops);
    static void sDump(const struct camera3_device *device, int fd);
    static int sFlush(const struct camera3_device *device);

    static camera3_device_ops_t sOps;
};

}; /* namespace android */

        In short, you can treat it as a real camera, except that its open function does not open a physical sensor but instead mmaps a block of shared memory, and its processCaptureRequest does not read live frames from a real sensor but pulls them out of that shared memory. We'll call it the virtual camera from here on. That is roughly all the logic there is: the HAL interfaces are standard, only the data it serves comes from shared memory. So who feeds data into that shared memory? The real camera that has already been opened.
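
        To make the idea concrete, here is a minimal sketch of the consumer side, i.e. what the virtual camera could do instead of opening a sensor. Everything here is illustrative: kShmPath, kFrameSize and the helper names are my own assumptions, not the actual implementation (in practice the region would come from the shared-memory HIDL service mentioned later in the summary).

#include <errno.h>
#include <fcntl.h>
#include <stdint.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

//Illustrative constants: where the real camera HAL publishes frames, and how big one frame is.
static const char  *kShmPath   = "/dev/virtual_camera_frame";   //placeholder path
static const size_t kFrameSize = 1280 * 720 * 3 / 2;            //one 1280x720 YV12 frame

//Called from the virtual camera's openDevice: map the shared region read-only
//instead of opening a physical sensor.
static uint8_t *mapSharedFrame() {
    int fd = open(kShmPath, O_RDONLY);
    if (fd < 0) {
        return NULL;                    //the real camera has not published anything yet
    }
    void *addr = mmap(NULL, kFrameSize, PROT_READ, MAP_SHARED, fd, 0);
    close(fd);                          //the mapping stays valid after closing the fd
    return (addr == MAP_FAILED) ? NULL : (uint8_t *)addr;
}

//Called from processCaptureRequest: snapshot the latest shared frame so it can
//then be converted and copied into the request's output buffer.
static void readSharedFrame(const uint8_t *shared, uint8_t *dst) {
    memcpy(dst, shared, kFrameSize);
}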

        I'm currently debugging on an MTK 6762 / Android 8.1 platform. In the MTK camera HAL, after the sensor data has gone through 3A and related processing, it is first handed to handleReturnBuffers in DisplayClient.BufOps.cpp. In that file we can grab the current frame with uint8_t * srcBuf = (uint8_t *)pStreamImgBuf->getVirAddr() and then write it into the shared memory.
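
        And a matching sketch of the producer side. This is not the MTK code itself; gSharedFrame and gSharedFrameSize are assumed to be a writable mapping of the same region (set up like the consumer mapping above, but with PROT_WRITE), and srcBuf/frameSize stand for whatever handleReturnBuffers obtained via pStreamImgBuf->getVirAddr() at the current preview resolution.

#include <stddef.h>
#include <stdint.h>
#include <string.h>

extern uint8_t *gSharedFrame;       //assumed writable mapping of the shared region
extern size_t   gSharedFrameSize;   //capacity of that mapping

//Copy the frame that just came back from the driver into the shared region so
//that the virtual camera's processCaptureRequest can pick it up.
static void publishPreviewFrame(const uint8_t *srcBuf, size_t frameSize) {
    if (gSharedFrame == NULL || srcBuf == NULL) {
        return;
    }
    size_t n = (frameSize < gSharedFrameSize) ? frameSize : gSharedFrameSize;
    memcpy(gSharedFrame, srcBuf, n);
}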

        Other platforms, such as Qualcomm or Spreadtrum, work on the same principle: find the point in the HAL where the frame data comes out, write it into shared memory, and read it back out in the virtual camera's processCaptureRequest. That way the virtual camera gets exactly the same data as the real camera.

        One catch: the data that comes out of the real camera HAL is YUV. On my platform, for example, the raw frames from this sensor are YV12. In the virtual camera's processCaptureRequest we cannot hand that straight to the process_capture_result callback, because the preview expects ABGR: the buffers behind an Android surface are GraphicBuffers, and the system's default GraphicBuffer format is generally RGB. So after reading a frame from shared memory we still need to convert it.

        YV12 cannot be converted to RGB directly; it has to be converted to I420 first. Here is the conversion function:

/*
I420: YYYYYYYY UU VV    => YUV420P
YV12: YYYYYYYY VV UU    => YUV420P
Converting YV12 to I420 only requires swapping the positions of the U and V planes.
*/
void Camera::YV12ToI420(uint8_t *YV12, char *I420, int w, int h)
{
    memcpy(I420, YV12, w*h);                    //Y plane
    memcpy(I420+w*h, YV12+w*h+w*h/4, w*h/4);    //U plane of I420 <- U plane of YV12 (stored after V)
    memcpy(I420+w*h+w*h/4, YV12+w*h, w*h/4);    //V plane of I420 <- V plane of YV12 (stored right after Y)
}

        Once the frame is I420, we can call libyuv's I420ToABGR. The converted data can be sent straight to the surface for display. As for how the shared memory itself is handled on Android 8.0 and later, see my earlier blog post.
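
        Roughly, the two-step conversion inside processCaptureRequest could look like the sketch below. The helper name, the scratch buffer and the assumption that dst is the already-locked RGBA output buffer with stride dstStride are mine for illustration; libyuv::I420ToABGR itself is the standard libyuv call.

#include <stdint.h>
#include <string.h>
#include <libyuv.h>     //for libyuv::I420ToABGR

//Convert one YV12 frame from the shared memory into the RGBA buffer that will be
//returned through process_capture_result. i420 is a scratch buffer of w*h*3/2
//bytes; dst is the locked output buffer; dstStride is its row stride in bytes.
static void convertFrameForPreview(uint8_t *yv12, uint8_t *i420, uint8_t *dst,
                                   int w, int h, int dstStride) {
    //Rearrange YV12 (Y, V, U) into I420 (Y, U, V), the same job as Camera::YV12ToI420 above.
    memcpy(i420, yv12, w * h);                                        //Y plane
    memcpy(i420 + w * h, yv12 + w * h + w * h / 4, w * h / 4);        //U plane
    memcpy(i420 + w * h + w * h / 4, yv12 + w * h, w * h / 4);        //V plane

    libyuv::I420ToABGR(i420, w,                             //Y plane, stride = width
                       i420 + w * h, w / 2,                 //U plane
                       i420 + w * h + w * h / 4, w / 2,     //V plane
                       dst, dstStride,                      //RGBA output
                       w, h);
}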

        To sum up, adding a virtual camera that lets multiple apps open the same camera involves the following pieces of work:

        1.) In cameraService, remove the restriction on multiple apps opening cameras (not limited to the same camera id)

        2.) Add a shared-memory HIDL service module

        3.) Add a HAL for the virtual camera (for the detailed steps, see my earlier blog post on adding a new camera)

        4.) In the real camera HAL, copy the data read from the driver into the shared memory

        5.) In the virtual camera's processCaptureRequest, read the data back out and convert it to the corresponding RGBA format.

        Once all of the above is done, the main work of adding the virtual camera is finished. There are still quite a few details to handle one by one, though.

        For example, we added a virtual camera, but the apps know nothing about it. All we tell app developers is that multiple apps may now open the same camera id at the same time. How the virtual camera is handled in the HAL and the framework is not something they know about, nor should they have to care. All they do is call Camera.open(0); in app1 to open camera id 0, possibly call Camera.open(0); in app2 as well, and maybe even call Camera.open(0); in app3, app4 and so on, all opening the same camera id 0.

        So here is the question: if every app opens the camera with the same id, how does the framework end up routing some of them to our virtual camera? This is handled in connectHelper. Just before the call to handleEvictionsLocked, we add the following code:

            //If the requested camera id is already open, quietly redirect this open() to a virtual camera.
            auto current = mActiveClientManager.get(cameraId);
            if (current != nullptr) 
            {
                char c_camera_ref[PROPERTY_VALUE_MAX] = {'\0'};
                int  i_camera_ref = 0;
                property_get(YOV_VIRTUAL_CAMERA_REF, c_camera_ref, "0");
                i_camera_ref = atoi(c_camera_ref);

                //Virtual camera ids start at 2 on this platform; if id 2 is already busy, fall back to id 3.
                cameraId = "2";
                auto tempClient = mActiveClientManager.get(cameraId);
                if(tempClient != nullptr)
                {
                    cameraId = "3";
                }
            }

        The platform I'm debugging on has front and rear cameras, so my virtual cameras get ids starting from 2. (The MTK camera HAL here is HAL1 while my virtual camera HAL is HAL3, and when counting cameras the two sides don't know how many the other exposes, so I had to hard-code the ids.) If you want more virtual cameras, their ids simply continue from there: 2, 3, 4 and so on.

        The code above means: if the requested camera id is already open, quietly switch the id to virtual camera 2; if virtual camera 2 is also open, open virtual camera 3 instead. The same logic scales to 10 or 20 virtual cameras, as sketched below.
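
        If you do add more virtual cameras, the hard-coded "2"/"3" check can be generalized into a small helper that picks the first free virtual id. This is only a sketch; kFirstVirtualId, kVirtualCameraCount and the helper name are my assumptions, and the manager parameter stands for mActiveClientManager:

#include <utils/String8.h>

//Sketch: walk the virtual camera ids and return the first one that is not in use
//yet; an empty String8 means every virtual camera is already busy.
static const int kFirstVirtualId     = 2;   //assumption: virtual ids start after the two real sensors
static const int kVirtualCameraCount = 2;   //assumption: how many virtual cameras the HAL exposes

template <typename ClientManagerT>
static android::String8 pickFreeVirtualCamera(const ClientManagerT &manager) {
    for (int id = kFirstVirtualId; id < kFirstVirtualId + kVirtualCameraCount; ++id) {
        android::String8 candidate = android::String8::format("%d", id);
        if (manager.get(candidate) == nullptr) {
            return candidate;               //this virtual camera is still free
        }
    }
    return android::String8();              //all virtual cameras are busy
}

        connectHelper would then replace the requested cameraId with the returned id before handleEvictionsLocked runs, exactly as the hard-coded version above does.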

        Every time a camera is opened, connectHelper makes a client via makeClient and immediately calls finishConnectLocked(client, partial), which adds that client to mActiveClientManager through the line mActiveClientManager.addAndEvict(clientDescriptor) inside finishConnectLocked. That list is exactly what lets us tell whether the camera with a given id has already been opened.

        At this point we can basically have two apps open one camera at the same time. However, if the main app (the first app to open the camera, the one that actually opened the real camera) exits, the secondary apps (the second, third, ... apps that opened an already-open id and were routed to the virtual camera) immediately stop previewing. When the main app exits, it closes the real camera in the HAL; with the real camera closed, the virtual camera gets no new data from the shared memory, so its preview naturally stops.

        To solve this we need to introduce reference counting. First, wherever the real camera is opened, we bump the count for that camera id. On the MTK 6762 platform, camera HAL1 devices are opened in vendor/mediatek/proprietary/hardware/mtkcam/main/hal/device/1.x/device/CameraDevice1Base.cpp, so in that file's open function we track the reference for the corresponding camera id like this:

Return<Status>
CameraDevice1Base::
open(const sp<ICameraDeviceCallback>& callback)
{
    ........
    //The real device can only ever be opened once, so its count is either 0 or 1
    if(mInstanceId == 0)
    {
        property_set(CAMERA0_REF, "1");
    }
    else if(mInstanceId == 1)
    {
        property_set(CAMERA1_REF, "1");
    }
    ........
}

        And in close we drop the count back down:

Return<void>
CameraDevice1Base::
close()
{
    ........
    if(mInstanceId == 0)
    {
        property_set(CAMERA0_REF, "0");
    }
    else if(mInstanceId == 1)
    {
        property_set(CAMERA1_REF, "0");
    }
    ........
}

        In the virtual camera's open function, the reference count is handled like this:

int Camera::open(hw_device_t **device) 
{
    ........
    //Multiple virtual cameras can be open at the same time, so this count keeps incrementing
    char c_camera_ref[PROPERTY_VALUE_MAX] = {'\0'};
    int  i_camera_ref = 0;
    property_get(VIRTUAL_CAMERA_REF, c_camera_ref, "0");
    i_camera_ref = atoi(c_camera_ref);    
    ++i_camera_ref;
    memset(c_camera_ref, '\0', PROPERTY_VALUE_MAX);
    sprintf(c_camera_ref, "%d", i_camera_ref);
    property_set(VIRTUAL_CAMERA_REF, c_camera_ref);
    ........
}

        Note that the virtual camera's count must not be decremented in its close function, because we still need the value later in cameraService.cpp.

        Next, when the app using the real camera exits and calls camera.stopPreview() and camera.release(), we use these reference counts to intercept the close and release operations on the underlying camera driver. On MTK 6762 / 8.1 the camera HAL is HAL1, so regardless of whether the app uses camera API1 or API2, makeClient always creates a CameraClient, and every camera operation from the app ends up going through it. That makes CameraClient the right place to intercept.

        For example, when the user calls stopPreview, we handle it in frameworks/av/services/camera/libcameraservice/api1/CameraClient.cpp as follows:

// stop preview mode
void CameraClient::stopPreview() {
    LOG1("stopPreview (pid %d), mCameraId=%d", getCallingPid(), mCameraId);
    Mutex::Autolock lock(mLock);

    if(mCameraId != 2 && mCameraId != 3)
    {
        //add by xuhui
        char c_camera_ref[PROPERTY_VALUE_MAX] = {'\0'};
        int  i_camera_ref = 0;
        property_get(VIRTUAL_CAMERA_REF, c_camera_ref, "0");
        i_camera_ref = atoi(c_camera_ref);

        //A virtual camera is currently open, so don't stop the preview yet; wait until the virtual camera is closed first, then stop it here.
        if(i_camera_ref > 0)
        {
            ALOGE("CameraClient::stopPreview, virtual camera exist, don't stopPreview");
            return;
        }
    }

    
    if (checkPidAndHardware() != NO_ERROR) return;


    disableMsgType(CAMERA_MSG_PREVIEW_FRAME);
    mHardware->stopPreview();
    sCameraService->updateProxyDeviceState(
        hardware::ICameraServiceProxy::CAMERA_STATE_IDLE,
        mCameraIdStr, mCameraFacing, mClientPackageName);
    mPreviewBuffer.clear();
}

        Then, when the user calls release, we handle it in disconnect:

binder::Status CameraClient::disconnect() {
    int callingPid = getCallingPid();
    Mutex::Autolock lock(mLock); 
    binder::Status res = binder::Status::ok(); 
    if(mCameraId != 2 && mCameraId != 3 )
    {
        //add by xuhui
        sp<CameraClient> client = NULL;
        char c_camera0_ref[PROPERTY_VALUE_MAX] = {'\0'};
        char c_camera1_ref[PROPERTY_VALUE_MAX] = {'\0'};
        char c_camera_virtual_ref[PROPERTY_VALUE_MAX] = {'\0'};
        int  i_camera_virtual_ref = 0;
        int  i_camera0_ref = 0;
        int  i_camera1_ref = 0;
        property_get(CAMERA0_REF, c_camera0_ref, "0");
        i_camera0_ref = atoi(c_camera0_ref);
        property_get(CAMERA1_REF, c_camera1_ref, "0");
        i_camera1_ref = atoi(c_camera1_ref);
        property_get(VIRTUAL_CAMERA_REF, c_camera_virtual_ref, "0");
        i_camera_virtual_ref = atoi(c_camera_virtual_ref);    
        if(i_camera_virtual_ref > 0)
        {
            ALOGE("CameraClient::disconnect, mCameraId=%d, virtual camera exist, don't disconnect", mCameraId);
            //-1 means a real camera wants to close, but a virtual camera is still open, so it cannot close for now; it will be closed when the virtual camera is closed
            if(mCameraId == 0)
            {
                property_set(CAMERA0_REF, "-1");
            }
            else if(mCameraId == 1)
            {
                property_set(CAMERA1_REF, "-1");
            }
            return res;
        }
        
    }        
    // Allow both client and the cameraserver to disconnect at all times
    if (callingPid != mClientPid && callingPid != mServicePid) {
        ALOGE("different client - don't disconnect");
        //If this line is not commented out, a different process cannot close a camera that was opened elsewhere
       // return res;
    }
    // Make sure disconnect() is done once and once only, whether it is called
    // from the user directly, or called by the destructor.
    if (mHardware == 0) return res;

    LOG1("CameraClient::disconnect, hardware teardown");
    // Before destroying mHardware, we must make sure it's in the
    // idle state.
    // Turn off all messages.
    disableMsgType(CAMERA_MSG_ALL_MSGS);
//!++
    disableMsgType(MTK_CAMERA_MSG_ALL_MSGS);
//!--
    mHardware->stopPreview();
    sCameraService->updateProxyDeviceState(
            hardware::ICameraServiceProxy::CAMERA_STATE_IDLE,
            mCameraIdStr, mCameraFacing, mClientPackageName);
    mHardware->cancelPicture();
    // Release the hardware resources.
    mHardware->release();
    // Release the held ANativeWindow resources.
    if (mPreviewWindow != 0) {
        disconnectWindow(mPreviewWindow);
        mPreviewWindow = 0;
        mHardware->setPreviewWindow(mPreviewWindow);
    }
    mHardware.clear();

    CameraService::Client::disconnect();

    LOG1("CameraClient::disconnect end,  (pid %d)", callingPid);

    return res;
}

        So we have intercepted the real camera's stopPreview and release calls into the HAL. Where, then, does the real release happen? Normally, once the main app has exited it has lost control of the camera object it created, and other apps have even less right to touch it. So how do we get the real camera to actually be released?

        Remember that, as mentioned above, every client created by makeClient in cameraService.cpp is stored in mActiveClientManager. The stored client is a BasicClient, which is an ancestor class of CameraClient, so whenever a camera is closed, BasicClient::disconnect is guaranteed to run. That function is where we really close the real camera:

binder::Status CameraService::BasicClient::disconnect() {
    binder::Status res = Status::ok();
        
    if (mDisconnected) {
        return res;
    }
    mDisconnected = true;

    sCameraService->removeByClient(this);
    sCameraService->logDisconnected(mCameraIdStr, mClientPid,
            String8(mClientPackageName));

    sp<IBinder> remote = getRemote();
    if (remote != nullptr) {
        remote->unlinkToDeath(sCameraService);
    }

    finishCameraOps();
    // Notify flashlight that a camera device is closed.
    sCameraService->mFlashlight->deviceClosed(mCameraIdStr);
    ALOGI("%s: Disconnected client for camera %s for PID %d", __FUNCTION__, mCameraIdStr.string(),
            mClientPid);

    // client shouldn't be able to call into us anymore
    mClientPid = 0;

    
        int id = cameraIdToInt(mCameraIdStr);
    
        if(id == 2 || id == 3)
        {
            //add by xuhui
            sp<CameraService::BasicClient> client = NULL;
            
            char c_camera0_ref[PROPERTY_VALUE_MAX] = {'\0'};
            char c_camera1_ref[PROPERTY_VALUE_MAX] = {'\0'};
            char c_camera_virtual_ref[PROPERTY_VALUE_MAX] = {'\0'};
            int  i_camera_virtual_ref = 0;
            int  i_camera0_ref = 0;
            int  i_camera1_ref = 0;
            
            property_get(CAMERA0_REF, c_camera0_ref, "0");
            i_camera0_ref = atoi(c_camera0_ref);
            
            property_get(CAMERA1_REF, c_camera1_ref, "0");
            i_camera1_ref = atoi(c_camera1_ref);
    
            property_get(VIRTUAL_CAMERA_REF, c_camera_virtual_ref, "0");
            i_camera_virtual_ref = atoi(c_camera_virtual_ref);  
            --i_camera_virtual_ref;
            if(i_camera_virtual_ref < 0)
            {
                i_camera_virtual_ref = 0;
            }
            sprintf(c_camera_virtual_ref, "%d", i_camera_virtual_ref);
            property_set(VIRTUAL_CAMERA_REF, c_camera_virtual_ref);
    
            //-1 means the app using the real camera has already called close; if the virtual camera
            //count has now dropped to 0, nothing references the virtual camera any more, so the real
            //camera can finally be closed as well.
            if(i_camera_virtual_ref == 0)
            {        
                if(i_camera0_ref == -1)
                {
                    String8 cameraId0("0");
                    client = sCameraService->mActiveClientManager.getCameraClient(cameraId0);
                }
                else if(i_camera1_ref == -1)
                {
                    String8 cameraId1("1");
                    client = sCameraService->mActiveClientManager.getCameraClient(cameraId1);
                }      
                if(client != NULL)
                {
                    client->disconnect();
                }
            }
        }            

    return res;
}

        With the function above, the real camera finally gets closed.

        At this point the real camera and the virtual camera can be open at the same time, and closing the real camera no longer shuts down its driver immediately; the driver is only closed once the virtual camera has been closed.

        This still isn't quite complete, though. For example, when the main app exits it may call something like windowManager.removeView(surfaceView) and remove the surface its camera was attached to. Down in the HAL there is then no ANativeWindow (i.e. no surface) and no corresponding dequeue_buffer / enqueue_buffer calls, so the camera HAL can no longer deliver its frames. On the app side this shows up as the preview suddenly going black or freezing on the last frame, and in the log you will see errors like these:

05-25 18:22:20.292  3542 15151 E BufferQueueProducer: [SurfaceTexture-1-3542-0](this:0xc35b5000,id:1,api:4,p:442,c:-1) queueBuffer: BufferQueue has been abandoned
05-25 18:22:20.293   442   663 E Surface : queueBuffer: error queuing buffer to SurfaceTexture, -19
05-25 18:22:20.293   461 16924 E MtkCam/DisplayClient: (16924)[enquePrvOps] mpStreamOps->enqueue_buffer failed: status[Function not implemented(38)], rpImgBuf(0xdd60c510,0xd5bbd000) (enquePrvOps){#589:vendor/mediatek/proprietary/hardware/mtkcam/middleware/v1/client/DisplayClient/DisplayClient.Stream.cpp}

        While we're at it, a quick note on how the app-side surface relates to the MTK camera HAL. The surface in the app corresponds to mpStreamOps, which is declared in DisplayClient.h:

preview_stream_ops*             mpStreamOps;

        The preview_stream_ops structure is defined as follows:

typedef struct preview_stream_ops {
    int (*dequeue_buffer)(struct preview_stream_ops* w,
                          buffer_handle_t** buffer, int *stride);
    int (*enqueue_buffer)(struct preview_stream_ops* w,
                buffer_handle_t* buffer);
    int (*cancel_buffer)(struct preview_stream_ops* w,
                buffer_handle_t* buffer);
    int (*set_buffer_count)(struct preview_stream_ops* w, int count);
    int (*set_buffers_geometry)(struct preview_stream_ops* pw,
                int w, int h, int format);
    int (*set_crop)(struct preview_stream_ops *w,
                int left, int top, int right, int bottom);
    int (*set_usage)(struct preview_stream_ops* w, int usage);
    int (*set_swap_interval)(struct preview_stream_ops *w, int interval);
    int (*get_min_undequeued_buffer_count)(const struct preview_stream_ops *w,
                int *count);
    int (*lock_buffer)(struct preview_stream_ops* w,
                buffer_handle_t* buffer);
    // Timestamps are measured in nanoseconds, and must be comparable
    // and monotonically increasing between two frames in the same
    // preview stream. They do not need to be comparable between
    // consecutive or parallel preview streams, cameras, or app runs.
    int (*set_timestamp)(struct preview_stream_ops *w, int64_t timestamp);
} preview_stream_ops_t;

        setWindow in DisplayClient.cpp calls set_preview_stream_ops in DisplayClient.Stream.cpp to take the surface handed down from the app and assign it to mpStreamOps. If you want to trace it upward layer by layer, start from setPreviewTarget in CameraClient.cpp:

// set the buffer consumer that the preview will use
status_t CameraClient::setPreviewTarget(
        const sp<IGraphicBufferProducer>& bufferProducer) {
    sp<IBinder> binder;
    sp<ANativeWindow> window;     
    if (bufferProducer != 0) {
        binder = IInterface::asBinder(bufferProducer);
        window = new Surface(bufferProducer, /*controlledByApp*/ true);
    }
    return setPreviewWindow(binder, window);
}

status_t CameraClient::setPreviewWindow(const sp<IBinder>& binder,
        const sp<ANativeWindow>& window) {
    ......
    result = mHardware->setPreviewWindow(window);
    ......
}


        Once the surface has been removed on the app side, the call err = mpStreamOps->dequeue_buffer(mpStreamOps, &phBuffer, &stride); in dequePrvOps in DisplayClient.Stream.cpp starts failing because no buffer can be dequeued. That failure propagates up to DisplayClient::prepareOneTodoBuffer in DisplayClient.BufOps.cpp, where dequePrvOps(pStreamImgBuf) errors out, then to DisplayClient::prepareAllTodoBuffers, where prepareOneTodoBuffer(rpBufQueue) fails, and finally to the loop in DisplayClient::onThreadLoop, which keeps spinning without ever getting data, so the preview goes black.

        To fix this, the app-side surface would have to stay alive forever, but that is not possible: even if the app never explicitly releases it, the system will reclaim the resources of an exited process after a while. So we need a different approach.

        That approach is actually quite simple: create a new Surface and hand it down to the HAL.

void CameraClient::setNewPreviewWindow()
{
    ALOGE("CameraClient::setNewPreviewWindow() start");
    BufferQueue::createBufferQueue(&mNewProducer, &mNewConsumer);
    ALOGE("CameraClient::setNewPreviewWindow() 1");    
    GLuint texName;
    glGenTextures(1, &texName);
    mNewSurfaceTexture = new GLConsumer(mNewConsumer, texName,
                    GL_TEXTURE_EXTERNAL_OES, true, true);
    if (mNewSurfaceTexture == 0) {
        ALOGE("CameraClient::setNewPreviewWindow() 2, Unable to create native SurfaceTexture");
        return;
    }
    ALOGE("CameraClient::setNewPreviewWindow() 3"); 
    mNewSurfaceTexture->setName(String8::format("SurfaceTexture-%d-%d-%d",
            texName,
            getpid(),
            createProcessUniqueId()));    
                    
    sp<SurfaceTextureListener> stListener(new SurfaceTextureListener());
    stListener->mtListener=stListener;
    mNewSurfaceTexture->setFrameAvailableListener(stListener);  

    //TODO: check whether this line is actually needed
    mNewSurfaceTexture->setDefaultBufferSize(1280, 720);    
    bool useAsync = false;
    status_t res;
    int32_t consumerUsage;
    if ((res = mNewProducer->query(NATIVE_WINDOW_CONSUMER_USAGE_BITS,
            &consumerUsage)) != OK) {
        ALOGE("CameraClient::setNewPreviewWindow() 5: Camera : Failed to query mNewConsumer usage");
        return;
    }
    if (consumerUsage & GraphicBuffer::USAGE_HW_TEXTURE) {
        ALOGE("CameraClient::setNewPreviewWindow() 6: Camera : Forcing asynchronous mode for stream");
        useAsync = true;
    }
    ALOGE("CameraClient::setNewPreviewWindow() 7"); 
    mNewSurface = new Surface(mNewProducer, useAsync);
    
    //ANativeWindow *anw = surface.get();

    ALOGE("CameraClient::setNewPreviewWindow() 8");  
    sp<IBinder> binder;
  //  sp<ANativeWindow> window;     
    if (mNewProducer != 0) {
        ALOGE("CameraClient::setNewPreviewWindow() 9"); 
        binder = IInterface::asBinder(mNewProducer);
    }
    setPreviewWindow(binder, mNewSurface);    
    ALOGE("CameraClient::setNewPreviewWindow() end"); 
}

        So where do we call this function? In CameraClient::disconnect(): when the app releases the camera, we manually set a new surface down to the HAL, and that's it.

binder::Status CameraClient::disconnect() {
    ......
    if(mCameraId != 2 && mCameraId != 3 )
    {
        //add by xuhui
        sp<CameraClient> client = NULL;
        
        char c_camera0_ref[PROPERTY_VALUE_MAX] = {'\0'};
        char c_camera1_ref[PROPERTY_VALUE_MAX] = {'\0'};
        char c_camera_virtual_ref[PROPERTY_VALUE_MAX] = {'\0'};
        int  i_camera_virtual_ref = 0;
        int  i_camera0_ref = 0;
        int  i_camera1_ref = 0;
        ALOGE("CameraClient::disconnect 3");
        property_get(CAMERA0_REF, c_camera0_ref, "0");
        i_camera0_ref = atoi(c_camera0_ref);
        
        property_get(CAMERA1_REF, c_camera1_ref, "0");
        i_camera1_ref = atoi(c_camera1_ref);

        ALOGE("CameraClient::disconnect 4");
        property_get(VIRTUAL_CAMERA_REF, c_camera_virtual_ref, "0");
        i_camera_virtual_ref = atoi(c_camera_virtual_ref);    
        ALOGE("CameraClient::disconnect, i_camera0_ref=%d, i_camera1_ref=%d, , i_camera_virtual_ref=%d", 
               i_camera0_ref, i_camera1_ref, i_camera_virtual_ref);
        if(i_camera_virtual_ref > 0)
        {
            ALOGE("CameraClient::disconnect, mCameraId=%d, virtual camera exist, don't disconnect", mCameraId);
            //-1 means a real camera wants to close, but a virtual camera is still open, so it cannot close for now; it will be closed when the virtual camera is closed
            if(mCameraId == 0)
            {
                property_set(CAMERA0_REF, "-1");
            }
            else if(mCameraId == 1)
            {
                property_set(CAMERA1_REF, "-1");
            }
            ALOGE("CameraClient::disconnect setNewPreviewWindow");
            setNewPreviewWindow();
            return res;
        }
        
    }    
    ......   
}

        That's the complete walkthrough of letting multiple apps open the same camera. On my device I can now have one app recording video in the background while another app in the foreground opens the very same camera.

        If you're interested in the camera stack, feel free to reach out and trade notes.
