
Category: Android

2013-11-19 15:19:45

Android Camera: A Complete Analysis of the OMX Preview Process


A previous article already covered the initialization of OMXCameraAdapter. To better understand how data flows between the A9 and the Ducati subsystem, it is worth taking a close look at the camera preview process when the OMX path is used.

A quick aside first: lately, whenever I finish editing a post, publishing it fails in mysterious ways, and an article I worked hard on simply vanishes. Frustrating! I now draft in Word instead. I blog mainly to record my own learning notes and so that we can all learn from each other, so I hope ChinaUnix keeps improving.

Enough preamble; let's get started.

As usual, we start the preview analysis from CameraHal, because the HAL-layer preview method performs some important initialization for the whole preview process. Let's look at the code.

/**

   @brief Start preview mode.

   @param none

   @return NO_ERROR Camera switched to VF mode

   @todo Update function header with the different errors that are possible

 */

status_t CameraHal::startPreview() {

    LOG_FUNCTION_NAME;

    // When tunneling is enabled during VTC, startPreview happens in 2 steps:

    // When the application sends the command CAMERA_CMD_PREVIEW_INITIALIZATION,

    // cameraPreviewInitialization() is called, which in turn causes the CameraAdapter

    // to move from loaded to idle state. And when the application calls startPreview,

    // the CameraAdapter moves from idle to executing state.

    //

    // If the application calls startPreview() without sending the command

    // CAMERA_CMD_PREVIEW_INITIALIZATION, then the function cameraPreviewInitialization()

    // AND startPreview() are executed. In other words, if the application calls

    // startPreview() without sending the command CAMERA_CMD_PREVIEW_INITIALIZATION,

    // then the CameraAdapter moves from loaded to idle to executing state in one shot.

    // The call below is the focus of our attention: it performs much of the
    // initialization needed before preview can start.

    status_t ret = cameraPreviewInitialization();

    // The flag mPreviewInitializationDone is set to true at the end of the function

    // cameraPreviewInitialization(). Therefore, if everything goes alright, then the

    // flag will be set. Sometimes, the function cameraPreviewInitialization() may

    // return prematurely if all the resources are not available for starting preview.

    // For example, if the preview window is not set, then it would return NO_ERROR.

    // Under such circumstances, one should return from startPreview as well and should

    // not continue execution. That is why, we check the flag and not the return value.

    if (!mPreviewInitializationDone) return ret;

    // Once startPreview is called, there is no need to continue to remember whether

    // the function cameraPreviewInitialization() was called earlier or not. And so

    // the flag mPreviewInitializationDone is reset here. Plus, this preserves the

    // current behavior of startPreview under the circumstances where the application

    // calls startPreview twice or more.

    mPreviewInitializationDone = false;

    ///Enable the display adapter if present, actual overlay enable happens when we post the buffer

    // Check whether we are using the overlay path; if so, the DisplayAdapter
    // is enabled here.

if(mDisplayAdapter.get() != NULL) {

        CAMHAL_LOGDA("Enabling display");

        int width, height;

        mParameters.getPreviewSize(&width, &height);

#if PPM_INSTRUMENTATION || PPM_INSTRUMENTATION_ABS

        ret = mDisplayAdapter->enableDisplay(width, height, &mStartPreview);

#else

        ret = mDisplayAdapter->enableDisplay(width, height, NULL);

#endif

        if ( ret != NO_ERROR ) {

            CAMHAL_LOGEA("Couldn't enable display");

            // FIXME: At this stage mStateSwitchLock is locked and unlock is supposed to be called

            //        only from mCameraAdapter->sendCommand(CameraAdapter::CAMERA_START_PREVIEW)

            //        below. But this will never happen because of goto error. Thus at next

            //        startPreview() call CameraHAL will be deadlocked.

            //        Need to revisit mStateSwitch lock, for now just abort the process.

            CAMHAL_ASSERT_X(false,

                "At this stage mCameraAdapter->mStateSwitchLock is still locked, "

                "deadlock is guaranteed");

            goto error;

        }

    }

    ///Send START_PREVIEW command to adapter

    CAMHAL_LOGDA("Starting CameraAdapter preview mode");

    // This sendCommand call is what actually tells the lower layer to start preview.

    ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_START_PREVIEW);

    if(ret!=NO_ERROR) {

        CAMHAL_LOGEA("Couldn't start preview w/ CameraAdapter");

        goto error;

    }

    CAMHAL_LOGDA("Started preview");

    mPreviewEnabled = true;

    mPreviewStartInProgress = false;

    return ret;

    error:

        CAMHAL_LOGEA("Performing cleanup after error");

        //Do all the cleanup

        freePreviewBufs();

        mCameraAdapter->sendCommand(CameraAdapter::CAMERA_STOP_PREVIEW);

        if(mDisplayAdapter.get() != NULL) {

            mDisplayAdapter->disableDisplay(false);

        }

        mAppCallbackNotifier->stop();

        mPreviewStartInProgress = false;

        mPreviewEnabled = false;

        LOG_FUNCTION_NAME_EXIT;

        return ret;

}

Now let's focus on exactly what initialization happens before preview starts.

/**

   @brief Set preview mode related initialization

          -> Camera Adapter set params

          -> Allocate buffers

          -> Set use buffers for preview

   @param none

   @return NO_ERROR

   @todo Update function header with the different errors that are possible

 */

status_t CameraHal::cameraPreviewInitialization()

{

    status_t ret = NO_ERROR;

    CameraAdapter::BuffersDescriptor desc;

    CameraFrame frame;

    unsigned int required_buffer_count;

    unsigned int max_queueble_buffers;

#if PPM_INSTRUMENTATION || PPM_INSTRUMENTATION_ABS

        gettimeofday(&mStartPreview, NULL);

#endif

    LOG_FUNCTION_NAME;

    if (mPreviewInitializationDone) {

        return NO_ERROR;

    }

    if ( mPreviewEnabled ){

      CAMHAL_LOGDA("Preview already running");

      LOG_FUNCTION_NAME_EXIT;

      return ALREADY_EXISTS;

    }

    // First, set the camera's basic parameters via setParameters; straightforward.

    if ( NULL != mCameraAdapter ) {

      ret = mCameraAdapter->setParameters(mParameters);

    }

    /* The first time you startPreview  --add by wang,hai-tao*/

    // (Comment added by me.) mPreviewStartInProgress indicates whether a preview
    // start is already in progress (false means it is not), and mDisplayPaused
    // means display had started but is temporarily paused. Both being false means
    // this is the first startPreview call, so we query the camera's matching
    // preview resolution and save the width and height in member variables for
    // later use.

    if ((mPreviewStartInProgress == false) && (mDisplayPaused == false)){

      ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_QUERY_RESOLUTION_PREVIEW,( int ) &frame);

      if ( NO_ERROR != ret ){

        CAMHAL_LOGEB("Error: CAMERA_QUERY_RESOLUTION_PREVIEW %d", ret);

        return ret;

      }

      ///Update the current preview width and height

      mPreviewWidth = frame.mWidth;

      mPreviewHeight = frame.mHeight;

    }

    // If no preview callback is enabled and no display adapter is set, we are
    // using neither the V4LCameraAdapter path nor the overlay path, so the OMX
    // path is the only option left; move the component to the Executing state.

    ///If we don't have the preview callback enabled and display adapter,

    if(!mSetPreviewWindowCalled || (mDisplayAdapter.get() == NULL)){

      CAMHAL_LOGD("Preview not started. Preview in progress flag set");

      mPreviewStartInProgress = true;

      ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_SWITCH_TO_EXECUTING);

      if ( NO_ERROR != ret ){

        CAMHAL_LOGEB("Error: CAMERA_SWITCH_TO_EXECUTING %d", ret);

        return ret;

      }

      return NO_ERROR;

    }

    // The overlay path where preview was merely paused: all we do here is
    // resume preview and restart the preview callbacks.

    if( (mDisplayAdapter.get() != NULL) && ( !mPreviewEnabled ) && ( mDisplayPaused ) )

        {

        CAMHAL_LOGDA("Preview is in paused state");

        mDisplayPaused = false;

        mPreviewEnabled = true;

        if ( NO_ERROR == ret )

            {

            ret = mDisplayAdapter->pauseDisplay(mDisplayPaused);

            if ( NO_ERROR != ret )

                {

                CAMHAL_LOGEB("Display adapter resume failed %x", ret);

                }

            }

        //restart preview callbacks

        if(mMsgEnabled & CAMERA_MSG_PREVIEW_FRAME)

        {

            mAppCallbackNotifier->enableMsgType (CAMERA_MSG_PREVIEW_FRAME);

        }

        signalEndImageCapture();

        return ret;

        }

    // Read the required buffer count from the camera properties.

    required_buffer_count = atoi(mCameraProperties->get(CameraProperties::REQUIRED_PREVIEW_BUFS));

    ///Allocate the preview buffers

    ret = allocPreviewBufs(mPreviewWidth, mPreviewHeight, mParameters.getPreviewFormat(), required_buffer_count, max_queueble_buffers);

    if ( NO_ERROR != ret )

        {

        CAMHAL_LOGEA("Couldn't allocate buffers for Preview");

        goto error;

        }

    // Honestly I have never been sure exactly which feature mMeasurementEnabled
    // flags; for now, treat it as a callback dedicated to measurement data.

    if ( mMeasurementEnabled )

        {

        // First query the frame length for this resolution.

        ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_QUERY_BUFFER_SIZE_PREVIEW_DATA,

                                          ( int ) &frame,

                                          required_buffer_count);

        if ( NO_ERROR != ret )

            {

            return ret;

            }

         ///Allocate the preview data buffers

        ret = allocPreviewDataBufs(frame.mLength, required_buffer_count);

        if ( NO_ERROR != ret ) {

            CAMHAL_LOGEA("Couldn't allocate preview data buffers");

            goto error;

           }

        if ( NO_ERROR == ret )

            {

            desc.mBuffers = mPreviewDataBuffers;

            desc.mOffsets = mPreviewDataOffsets;

            desc.mFd = mPreviewDataFd;

            desc.mLength = mPreviewDataLength;

            desc.mCount = ( size_t ) required_buffer_count;

            desc.mMaxQueueable = (size_t) required_buffer_count;

            // desc (a BuffersDescriptor) packs up our buffer attributes; the
            // sendCommand below then tells the adapter to use the buffers we
            // allocated ourselves. This call, together with the later
            // CAMERA_USE_BUFFERS_PREVIEW sendCommand, is the key part of this
            // initialization.
            mCameraAdapter->sendCommand(CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW_DATA,
                                        ( int ) &desc);

            }

        }

    ///Pass the buffers to Camera Adapter

    desc.mBuffers = mPreviewBuffers;

    desc.mOffsets = mPreviewOffsets;

    desc.mFd = mPreviewFd;

    desc.mLength = mPreviewLength;

    desc.mCount = ( size_t ) required_buffer_count;

    desc.mMaxQueueable = (size_t) max_queueble_buffers;

    // ...and this sendCommand here is the other one to note.

    ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW,

                                      ( int ) &desc);

    if ( NO_ERROR != ret )

        {

        CAMHAL_LOGEB("Failed to register preview buffers: 0x%x", ret);

        freePreviewBufs();

        return ret;

        }

    mAppCallbackNotifier->startPreviewCallbacks(mParameters, mPreviewBuffers, mPreviewOffsets, mPreviewFd, mPreviewLength, required_buffer_count);

    ///Start the callback notifier

    ret = mAppCallbackNotifier->start();

    if( ALREADY_EXISTS == ret )

        {

        //Already running, do nothing

        CAMHAL_LOGDA("AppCallbackNotifier already running");

        ret = NO_ERROR;

        }

    else if ( NO_ERROR == ret ) {

        CAMHAL_LOGDA("Started AppCallbackNotifier..");

        mAppCallbackNotifier->setMeasurements(mMeasurementEnabled);

        }

    else

        {

        CAMHAL_LOGDA("Couldn't start AppCallbackNotifier");

        goto error;

        }

    if (ret == NO_ERROR) mPreviewInitializationDone = true;

    return ret;

    error:

        CAMHAL_LOGEA("Performing cleanup after error");

        //Do all the cleanup

        freePreviewBufs();

        mCameraAdapter->sendCommand(CameraAdapter::CAMERA_STOP_PREVIEW);

        if(mDisplayAdapter.get() != NULL)

            {

            mDisplayAdapter->disableDisplay(false);

            }

        mAppCallbackNotifier->stop();

        mPreviewStartInProgress = false;

        mPreviewEnabled = false;

        LOG_FUNCTION_NAME_EXIT;

        return ret;

}

Now let's walk through the implementation behind the following call:

ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_SWITCH_TO_EXECUTING);

This call ends up in BaseCameraAdapter's sendCommand, which in turn calls OMXCameraAdapter's switchToExecuting method, shown below.

status_t OMXCameraAdapter::switchToExecuting()

{

  status_t ret = NO_ERROR;

  TIUTILS::Message msg;

  LOG_FUNCTION_NAME;

  mStateSwitchLock.lock();

  msg.command = CommandHandler::CAMERA_SWITCH_TO_EXECUTING;

  msg.arg1 = mErrorNotifier;

  ret = mCommandHandler->put(&msg);

  LOG_FUNCTION_NAME_EXIT;

  return ret;

}

This method merely posts a message to mCommandHandler, telling it to process the CAMERA_SWITCH_TO_EXECUTING command. Let's see how mCommandHandler's handler deals with our command:

case CommandHandler::CAMERA_SWITCH_TO_EXECUTING:

            {

                stat = mCameraAdapter->doSwitchToExecuting();

                break;

            }

And here is that method's implementation:

status_t OMXCameraAdapter::doSwitchToExecuting()

{

  status_t ret = NO_ERROR;

  OMX_ERRORTYPE eError = OMX_ErrorNone;

  LOG_FUNCTION_NAME;

  if ( (mComponentState == OMX_StateExecuting) || (mComponentState == OMX_StateInvalid) ){

    CAMHAL_LOGDA("Already in OMX_Executing state or OMX_StateInvalid state");

    mStateSwitchLock.unlock();

    return NO_ERROR;

  }

  if ( 0 != mSwitchToExecSem.Count() ){

    CAMHAL_LOGEB("Error mSwitchToExecSem semaphore count %d", mSwitchToExecSem.Count());

    goto EXIT;

  }

  ///Register for Preview port DISABLE  event

  ret = RegisterForEvent(mCameraAdapterParameters.mHandleComp,

                         OMX_EventCmdComplete,

                         OMX_CommandPortDisable,

                         mCameraAdapterParameters.mPrevPortIndex,

                         mSwitchToExecSem);

  if ( NO_ERROR != ret ){

    CAMHAL_LOGEB("Error in registering Port Disable for event %d", ret);

    goto EXIT;

  }

  ///Disable Preview Port

  eError = OMX_SendCommand(mCameraAdapterParameters.mHandleComp,

                           OMX_CommandPortDisable,

                           mCameraAdapterParameters.mPrevPortIndex,

                           NULL);

  ret = mSwitchToExecSem.WaitTimeout(OMX_CMD_TIMEOUT);

  if (ret != NO_ERROR){

    CAMHAL_LOGEB("Timeout PREVIEW PORT DISABLE %d", ret);

  }

  CAMHAL_LOGVB("PREV PORT DISABLED %d", ret);

  ///Register for IDLE state switch event

  ret = RegisterForEvent(mCameraAdapterParameters.mHandleComp,

                         OMX_EventCmdComplete,

                         OMX_CommandStateSet,

                         OMX_StateIdle,

                         mSwitchToExecSem);

  if(ret!=NO_ERROR)

    {

      CAMHAL_LOGEB("Error in IDLE STATE SWITCH %d", ret);

      goto EXIT;

    }

  eError = OMX_SendCommand (mCameraAdapterParameters.mHandleComp ,

                            OMX_CommandStateSet,

                            OMX_StateIdle,

                            NULL);

  GOTO_EXIT_IF((eError!=OMX_ErrorNone), eError);

  ret = mSwitchToExecSem.WaitTimeout(OMX_CMD_TIMEOUT);

  if (ret != NO_ERROR){

    CAMHAL_LOGEB("Timeout IDLE STATE SWITCH %d", ret);

    goto EXIT;

  }

  mComponentState = OMX_StateIdle;

  CAMHAL_LOGVB("OMX_SendCommand(OMX_StateIdle) 0x%x", eError);

  ///Register for EXECUTING state switch event

  ret = RegisterForEvent(mCameraAdapterParameters.mHandleComp,

                         OMX_EventCmdComplete,

                         OMX_CommandStateSet,

                         OMX_StateExecuting,

                         mSwitchToExecSem);

  if(ret!=NO_ERROR)

    {

      CAMHAL_LOGEB("Error in EXECUTING STATE SWITCH %d", ret);

      goto EXIT;

    }

  eError = OMX_SendCommand (mCameraAdapterParameters.mHandleComp ,

                            OMX_CommandStateSet,

                            OMX_StateExecuting,

                            NULL);

  GOTO_EXIT_IF((eError!=OMX_ErrorNone), eError);

  ret = mSwitchToExecSem.WaitTimeout(OMX_CMD_TIMEOUT);

  if (ret != NO_ERROR){

    CAMHAL_LOGEB("Timeout EXEC STATE SWITCH %d", ret);

    goto EXIT;

  }

  mComponentState = OMX_StateExecuting;

  CAMHAL_LOGVB("OMX_SendCommand(OMX_StateExecuting) 0x%x", eError);

  mStateSwitchLock.unlock();

  LOG_FUNCTION_NAME_EXIT;

  return ret;

 EXIT:

  CAMHAL_LOGEB("Exiting function %s because of ret %d eError=%x", __FUNCTION__, ret, eError);

  performCleanupAfterError();

  mStateSwitchLock.unlock();

  LOG_FUNCTION_NAME_EXIT;

  return (ret | ErrorUtils::omxToAndroidError(eError));

}

The sequence above does three things:

1. Disable the preview port: register for the completion event, then wait for the component's notification.

2. Transition to the Idle state: register for the completion event, then wait for the component's notification.

3. Transition to the Executing state: register for the completion event, then wait for the component's notification.

Next, the key question: we allocated the buffers ourselves, so how do we tell the lower layer to use our buffers instead of allocating its own? That call eventually reaches the adapter's useBuffers method, so let's look at its implementation directly.

///API to give the buffers to Adapter -- this comment says it all.

status_t OMXCameraAdapter::useBuffers(CameraMode mode, CameraBuffer * bufArr, int num, size_t length, unsigned int queueable)

{

    OMX_ERRORTYPE eError = OMX_ErrorNone;

    status_t ret = NO_ERROR;

    LOG_FUNCTION_NAME;

    switch(mode)

        {

        case CAMERA_PREVIEW:

            mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mPrevPortIndex].mNumBufs =  num;

            mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mPrevPortIndex].mMaxQueueable = queueable;

            ret = UseBuffersPreview(bufArr, num);

            break;

        case CAMERA_IMAGE_CAPTURE:

            mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mImagePortIndex].mNumBufs = num;

            mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mImagePortIndex].mMaxQueueable = queueable;

            ret = UseBuffersCapture(bufArr, num);

            break;

        case CAMERA_VIDEO:

            mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mVideoPortIndex].mNumBufs =  num;

            mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mVideoPortIndex].mMaxQueueable = queueable;

            ret = UseBuffersRawCapture(bufArr, num);

            break;

        case CAMERA_MEASUREMENT:

            mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mMeasurementPortIndex].mNumBufs = num;

            mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mMeasurementPortIndex].mMaxQueueable = queueable;

            ret = UseBuffersPreviewData(bufArr, num);

            break;

        case CAMERA_REPROCESS:

            mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mVideoInPortIndex].mNumBufs = num;

            mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mVideoInPortIndex].mMaxQueueable = queueable;

            ret = UseBuffersReprocess(bufArr, num);

            break;

        }

    LOG_FUNCTION_NAME_EXIT;

    return ret;

}

Now let's look at the UseBuffersPreview method:

status_t OMXCameraAdapter::UseBuffersPreview(CameraBuffer * bufArr, int num)

{

    status_t ret = NO_ERROR;

    OMX_ERRORTYPE eError = OMX_ErrorNone;

    int tmpHeight, tmpWidth;

    LOG_FUNCTION_NAME;

    if(!bufArr)

        {

        CAMHAL_LOGEA("NULL pointer passed for buffArr");

        LOG_FUNCTION_NAME_EXIT;

        return BAD_VALUE;

        }

    OMXCameraPortParameters * mPreviewData = NULL;

    OMXCameraPortParameters *measurementData = NULL;

    mPreviewData = &mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mPrevPortIndex];

    measurementData = &mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mMeasurementPortIndex];

    mPreviewData->mNumBufs = num ;

    if ( 0 != mUsePreviewSem.Count() )

        {

        CAMHAL_LOGEB("Error mUsePreviewSem semaphore count %d", mUsePreviewSem.Count());

        LOG_FUNCTION_NAME_EXIT;

        return NO_INIT;

        }

    if(mPreviewData->mNumBufs != num)

        {

        CAMHAL_LOGEA("Current number of buffers doesnt equal new num of buffers passed!");

        LOG_FUNCTION_NAME_EXIT;

        return BAD_VALUE;

        }

    mStateSwitchLock.lock();

    // Honestly I am not entirely sure what the methods below do; they are not
    // essential here, so we skip over them for now.

    if ( mComponentState == OMX_StateLoaded ) {

        if (mPendingPreviewSettings & SetLDC) {

            mPendingPreviewSettings &= ~SetLDC;

            ret = setLDC(mIPP);

            if ( NO_ERROR != ret ) {

                CAMHAL_LOGEB("setLDC() failed %d", ret);

            }

        }

        if (mPendingPreviewSettings & SetNSF) {

            mPendingPreviewSettings &= ~SetNSF;

            ret = setNSF(mIPP);

            if ( NO_ERROR != ret ) {

                CAMHAL_LOGEB("setNSF() failed %d", ret);

            }

        }

        if (mPendingPreviewSettings & SetCapMode) {

            mPendingPreviewSettings &= ~SetCapMode;

            ret = setCaptureMode(mCapMode);

            if ( NO_ERROR != ret ) {

                CAMHAL_LOGEB("setCaptureMode() failed %d", ret);

            }

        }

        if(mCapMode == OMXCameraAdapter::VIDEO_MODE) {

            if (mPendingPreviewSettings & SetVNF) {

                mPendingPreviewSettings &= ~SetVNF;

                ret = enableVideoNoiseFilter(mVnfEnabled);

                if ( NO_ERROR != ret){

                    CAMHAL_LOGEB("Error configuring VNF %x", ret);

                }

            }

            if (mPendingPreviewSettings & SetVSTAB) {

                mPendingPreviewSettings &= ~SetVSTAB;

                ret = enableVideoStabilization(mVstabEnabled);

                if ( NO_ERROR != ret) {

                    CAMHAL_LOGEB("Error configuring VSTAB %x", ret);

                }

            }

        }

    }

    ret = setSensorOrientation(mSensorOrientation);

    if ( NO_ERROR != ret )

        {

        CAMHAL_LOGEB("Error configuring Sensor Orientation %x", ret);

        mSensorOrientation = 0;

        }

    if ( mComponentState == OMX_StateLoaded )

        {

        ///Register for IDLE state switch event

        ret = RegisterForEvent(mCameraAdapterParameters.mHandleComp,

                               OMX_EventCmdComplete,

                               OMX_CommandStateSet,

                               OMX_StateIdle,

                               mUsePreviewSem);

        if(ret!=NO_ERROR)

            {

            CAMHAL_LOGEB("Error in registering for event %d", ret);

            goto EXIT;

            }

        ///Once we get the buffers, move component state to idle state and pass the buffers to OMX comp using UseBuffer

        eError = OMX_SendCommand (mCameraAdapterParameters.mHandleComp ,

                                  OMX_CommandStateSet,

                                  OMX_StateIdle,

                                  NULL);

        CAMHAL_LOGDB("OMX_SendCommand(OMX_CommandStateSet) 0x%x", eError);

        GOTO_EXIT_IF((eError!=OMX_ErrorNone), eError);

        mComponentState = OMX_StateIdle;

        }

    else

        {

            ///Register for Preview port ENABLE event

            ret = RegisterForEvent(mCameraAdapterParameters.mHandleComp,

                                   OMX_EventCmdComplete,

                                   OMX_CommandPortEnable,

                                   mCameraAdapterParameters.mPrevPortIndex,

                                   mUsePreviewSem);

            if ( NO_ERROR != ret )

                {

                CAMHAL_LOGEB("Error in registering for event %d", ret);

                goto EXIT;

                }

            ///Enable Preview Port

            eError = OMX_SendCommand(mCameraAdapterParameters.mHandleComp,

                                     OMX_CommandPortEnable,

                                     mCameraAdapterParameters.mPrevPortIndex,

                                     NULL);

        }

    ///Configure DOMX to use either gralloc handles or vptrs

    OMX_TI_PARAMUSENATIVEBUFFER domxUseGrallocHandles;

    OMX_INIT_STRUCT_PTR (&domxUseGrallocHandles, OMX_TI_PARAMUSENATIVEBUFFER);

    domxUseGrallocHandles.nPortIndex = mCameraAdapterParameters.mPrevPortIndex;

    domxUseGrallocHandles.bEnable = OMX_TRUE;

    eError = OMX_SetParameter(mCameraAdapterParameters.mHandleComp,

                            (OMX_INDEXTYPE)OMX_TI_IndexUseNativeBuffers, &domxUseGrallocHandles);

    if(eError!=OMX_ErrorNone)

        {

        CAMHAL_LOGEB("OMX_SetParameter - %x", eError);

        }

    GOTO_EXIT_IF((eError!=OMX_ErrorNone), eError);

    OMX_BUFFERHEADERTYPE *pBufferHdr;

    for(int index=0;index<num;index++) {

        OMX_U8 *ptr;

        ptr = (OMX_U8 *)camera_buffer_get_omx_ptr (&bufArr[index]);

        eError = OMX_UseBuffer( mCameraAdapterParameters.mHandleComp,

                                &pBufferHdr,

                                mCameraAdapterParameters.mPrevPortIndex,

                                0,

                                mPreviewData->mBufSize,

                                ptr);

        if(eError!=OMX_ErrorNone)

            {

            CAMHAL_LOGEB("OMX_UseBuffer-0x%x", eError);

            }

        GOTO_EXIT_IF((eError!=OMX_ErrorNone), eError);

        pBufferHdr->pAppPrivate = (OMX_PTR)&bufArr[index];

        pBufferHdr->nSize = sizeof(OMX_BUFFERHEADERTYPE);

        pBufferHdr->nVersion.s.nVersionMajor = 1 ;

        pBufferHdr->nVersion.s.nVersionMinor = 1 ;

        pBufferHdr->nVersion.s.nRevision = 0 ;

        pBufferHdr->nVersion.s.nStep =  0;

        mPreviewData->mBufferHeader[index] = pBufferHdr;

    }

    if ( mMeasurementEnabled )

        {

        for( int i = 0; i < num; i++ )

            {

            OMX_BUFFERHEADERTYPE *pBufHdr;

            OMX_U8 *ptr;

            ptr = (OMX_U8 *)camera_buffer_get_omx_ptr (&mPreviewDataBuffers[i]);

            eError = OMX_UseBuffer( mCameraAdapterParameters.mHandleComp,

                                    &pBufHdr,

                                    mCameraAdapterParameters.mMeasurementPortIndex,

                                    0,

                                    measurementData->mBufSize,

                                    ptr);

             if ( eError == OMX_ErrorNone )

                {

                pBufHdr->pAppPrivate = (OMX_PTR *)&mPreviewDataBuffers[i];

                pBufHdr->nSize = sizeof(OMX_BUFFERHEADERTYPE);

                pBufHdr->nVersion.s.nVersionMajor = 1 ;

                pBufHdr->nVersion.s.nVersionMinor = 1 ;

                pBufHdr->nVersion.s.nRevision = 0 ;

                pBufHdr->nVersion.s.nStep =  0;

                measurementData->mBufferHeader[i] = pBufHdr;

                }

            else

                {

                CAMHAL_LOGEB("OMX_UseBuffer -0x%x", eError);

                ret = BAD_VALUE;

                break;

                }

            }

        }

    CAMHAL_LOGDA("Registering preview buffers");

    ret = mUsePreviewSem.WaitTimeout(OMX_CMD_TIMEOUT);

    //If something bad happened while we wait

    if (mComponentState == OMX_StateInvalid)

      {

        CAMHAL_LOGEA("Invalid State after Registering preview buffers Exitting!!!");

        goto EXIT;

      }

    if ( NO_ERROR == ret )

        {

        CAMHAL_LOGDA("Preview buffer registration successfull");

        }

    else

        {

        if ( mComponentState == OMX_StateLoaded )

            {

            ret |= RemoveEvent(mCameraAdapterParameters.mHandleComp,

                               OMX_EventCmdComplete,

                               OMX_CommandStateSet,

                               OMX_StateIdle,

                               NULL);

            }

        else

            {

            ret |= SignalEvent(mCameraAdapterParameters.mHandleComp,

                               OMX_EventCmdComplete,

                               OMX_CommandPortEnable,

                               mCameraAdapterParameters.mPrevPortIndex,

                               NULL);

            }

        CAMHAL_LOGEA("Timeout expired on preview buffer registration");

        goto EXIT;

        }

    LOG_FUNCTION_NAME_EXIT;

    return (ret | ErrorUtils::omxToAndroidError(eError));

    ///If there is any failure, we reach here.

    ///Here, we do any resource freeing and convert from OMX error code to Camera Hal error code

EXIT:

    mStateSwitchLock.unlock();

    CAMHAL_LOGEB("Exiting function %s because of ret %d eError=%x", __FUNCTION__, ret, eError);

    performCleanupAfterError();

    CAMHAL_LOGEB("Exiting function %s because of ret %d eError=%x", __FUNCTION__, ret, eError);

    LOG_FUNCTION_NAME_EXIT;

    return (ret | ErrorUtils::omxToAndroidError(eError));

}

Finally, it is worth looking at the startPreviewCallbacks method that gets called:

status_t AppCallbackNotifier::startPreviewCallbacks(CameraParameters &params, CameraBuffer *buffers, uint32_t *offsets, int fd, size_t length, size_t count)

{

    sp<MemoryHeapBase> heap;

    sp<MemoryBase> buffer;

    unsigned int *bufArr;

    int size = 0;

    LOG_FUNCTION_NAME;

    Mutex::Autolock lock(mLock);

    if ( NULL == mFrameProvider )

        {

        CAMHAL_LOGEA("Trying to start video recording without FrameProvider");

        return -EINVAL;

        }

    if ( mPreviewing )

        {

        CAMHAL_LOGDA("+Already previewing");

        return NO_INIT;

        }

    int w,h;

    ///Get preview size

    params.getPreviewSize(&w, &h);

    // save preview pixel format, size and stride

    mPreviewWidth = w;

    mPreviewHeight = h;

    mPreviewStride = 4096;

    mPreviewPixelFormat = getContstantForPixelFormat(params.getPreviewFormat());

    size = calculateBufferSize(w, h, mPreviewPixelFormat);

    // Allocate the callback memory based on the size just computed.

    mPreviewMemory = mRequestMemory(-1, size, AppCallbackNotifier::MAX_BUFFERS, NULL);

    if (!mPreviewMemory) {

        return NO_MEMORY;

    }

    for (int i=0; i < AppCallbackNotifier::MAX_BUFFERS; i++) {

        mPreviewBuffers[i].type = CAMERA_BUFFER_MEMORY;

        mPreviewBuffers[i].opaque = (unsigned char*) mPreviewMemory->data + (i*size);

        mPreviewBuffers[i].mapped = mPreviewBuffers[i].opaque;

    }

    if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME ) ) {

         mFrameProvider->enableFrameNotification(CameraFrame::PREVIEW_FRAME_SYNC);

    }

    if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_POSTVIEW_FRAME) ) {

         mFrameProvider->enableFrameNotification(CameraFrame::SNAPSHOT_FRAME);

    }

    mPreviewBufCount = 0;

    mPreviewing = true;

    LOG_FUNCTION_NAME_EXIT;

    return NO_ERROR;

}

That concludes the initialization performed by startPreview. Now let's drop down a level and see how OMXCameraAdapter actually starts the preview:

status_t OMXCameraAdapter::startPreview()

{

    status_t ret = NO_ERROR;

    OMX_ERRORTYPE eError = OMX_ErrorNone;

    OMXCameraPortParameters *mPreviewData = NULL;

    OMXCameraPortParameters *measurementData = NULL;

    LOG_FUNCTION_NAME;

    if( 0 != mStartPreviewSem.Count() )

        {

        CAMHAL_LOGEB("Error mStartPreviewSem semaphore count %d", mStartPreviewSem.Count());

        ret = NO_INIT;

        goto EXIT;

        }

    // Enable all preview mode extra data.

    if ( OMX_ErrorNone == eError) {

        ret |= setExtraData(true, mCameraAdapterParameters.mPrevPortIndex, OMX_AncillaryData);

        ret |= setExtraData(true, OMX_ALL, OMX_TI_VectShotInfo);

    }

    mPreviewData = &mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mPrevPortIndex];

    measurementData = &mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mMeasurementPortIndex];

    if( OMX_StateIdle == mComponentState )

        {

        ///Register for EXECUTING state transition.

        ///This method just inserts a message in Event Q, which is checked in the callback

        ///The semaphore passed is signalled by the callback

        ret = RegisterForEvent(mCameraAdapterParameters.mHandleComp,

                               OMX_EventCmdComplete,

                               OMX_CommandStateSet,

                               OMX_StateExecuting,

                               mStartPreviewSem);

        if(ret!=NO_ERROR)

            {

            CAMHAL_LOGEB("Error in registering for event %d", ret);

            goto EXIT;

            }

        ///Switch to EXECUTING state

        eError = OMX_SendCommand(mCameraAdapterParameters.mHandleComp,

                                 OMX_CommandStateSet,

                                 OMX_StateExecuting,

                                 NULL);

        if(eError!=OMX_ErrorNone)

            {

            CAMHAL_LOGEB("OMX_SendCommand(OMX_StateExecuting)-0x%x", eError);

            }

        CAMHAL_LOGDA("+Waiting for component to go into EXECUTING state");

        ret = mStartPreviewSem.WaitTimeout(OMX_CMD_TIMEOUT);

        //If something bad happened while we wait

        if (mComponentState == OMX_StateInvalid)

          {

            CAMHAL_LOGEA("Invalid State after IDLE_EXECUTING Exitting!!!");

            goto EXIT;

          }

        if ( NO_ERROR == ret )

            {

            CAMHAL_LOGDA("+Great. Component went into executing state!!");

            }

        else

            {

            ret |= RemoveEvent(mCameraAdapterParameters.mHandleComp,

                               OMX_EventCmdComplete,

                               OMX_CommandStateSet,

                               OMX_StateExecuting,

                               NULL);

            CAMHAL_LOGDA("Timeout expired on executing state switch!");

            goto EXIT;

            }

        mComponentState = OMX_StateExecuting;

        }

    mStateSwitchLock.unlock();

    //Queue all the buffers on preview port

    for(int index=0;index< mPreviewData->mMaxQueueable;index++)

        {

        CAMHAL_LOGDB("Queuing buffer on Preview port - 0x%x", (uint32_t)mPreviewData->mBufferHeader[index]->pBuffer);

        mPreviewData->mStatus[index] = OMXCameraPortParameters::FILL;

        eError = OMX_FillThisBuffer(mCameraAdapterParameters.mHandleComp,

                    (OMX_BUFFERHEADERTYPE*)mPreviewData->mBufferHeader[index]);

        if(eError!=OMX_ErrorNone)

            {

            CAMHAL_LOGEB("OMX_FillThisBuffer-0x%x", eError);

            }

        mFramesWithDucati++;

#ifdef CAMERAHAL_DEBUG

        mBuffersWithDucati.add((int)mPreviewData->mBufferHeader[index]->pAppPrivate,1);

#endif

        GOTO_EXIT_IF((eError!=OMX_ErrorNone), eError);

        }

    if ( mMeasurementEnabled )

        {

        for(int index=0;index< mPreviewData->mNumBufs;index++)

            {

            CAMHAL_LOGDB("Queuing buffer on Measurement port - 0x%x", (uint32_t) measurementData->mBufferHeader[index]->pBuffer);

            measurementData->mStatus[index] = OMXCameraPortParameters::FILL;

            eError = OMX_FillThisBuffer(mCameraAdapterParameters.mHandleComp,

                            (OMX_BUFFERHEADERTYPE*) measurementData->mBufferHeader[index]);

            if(eError!=OMX_ErrorNone)

                {

                CAMHAL_LOGEB("OMX_FillThisBuffer-0x%x", eError);

                }

            GOTO_EXIT_IF((eError!=OMX_ErrorNone), eError);

            }

        }

    setFocusCallback(true);

    //reset frame rate estimates

    mFPS = 0.0f;

    mLastFPS = 0.0f;

    // start frame count from 0. i.e first frame after

    // startPreview will be the 0th reference frame

    // this way we will wait for second frame until

    // takePicture/autoFocus is allowed to run. we

    // are seeing SetConfig/GetConfig fail after

    // calling after the first frame and not failing

    // after the second frame

    mFrameCount = -1;

    mLastFrameCount = 0;

    mIter = 1;

    mLastFPSTime = systemTime();

    mTunnelDestroyed = false;

    LOG_FUNCTION_NAME_EXIT;

    return (ret | ErrorUtils::omxToAndroidError(eError));

    EXIT:

    CAMHAL_LOGEB("Exiting function %s because of ret %d eError=%x", __FUNCTION__, ret, eError);

    performCleanupAfterError();

    mStateSwitchLock.unlock();

    LOG_FUNCTION_NAME_EXIT;

    return (ret | ErrorUtils::omxToAndroidError(eError));

}
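A note on the wait above: mStartPreviewSem acts as a counting semaphore that the OMX event callback signals once the IDLE-to-EXECUTING transition completes, while startPreview() blocks in WaitTimeout(). A minimal sketch of such a semaphore (hypothetical class name, not TI's actual `Semaphore` implementation) might look like this:

```cpp
#include <cassert>
#include <chrono>
#include <condition_variable>
#include <mutex>

// Hypothetical sketch of the Count()/Signal()/WaitTimeout() semaphore used
// around the state transition: the OMX event callback calls Signal(), and
// startPreview() blocks in WaitTimeout() until it does or the timeout hits.
class CountingSemaphore {
public:
    int Count() {
        std::lock_guard<std::mutex> lock(mLock);
        return mCount;
    }
    void Signal() {                        // called from the event callback
        std::lock_guard<std::mutex> lock(mLock);
        ++mCount;
        mCond.notify_one();
    }
    // Returns true if signalled within the timeout, false on expiry.
    bool WaitTimeout(long timeoutUs) {
        std::unique_lock<std::mutex> lock(mLock);
        bool signalled = mCond.wait_for(lock,
                std::chrono::microseconds(timeoutUs),
                [this] { return mCount > 0; });
        if (signalled)
            --mCount;                      // consume the signal
        return signalled;
    }
private:
    std::mutex mLock;
    std::condition_variable mCond;
    int mCount = 0;
};
```

This also explains the `mStartPreviewSem.Count() != 0` check at the top of startPreview(): a non-zero count would mean a stale, unconsumed signal from an earlier transition.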

This lower-level method turns out to be simpler than I expected, so it needs little explanation: it first makes sure the component is in the EXECUTING state, then calls OMX_FillThisBuffer to queue all of the buffers on the component so they can receive captured frame data. When the component finishes filling a buffer, it notifies the client through the FillBufferDone callback, which is also how the buffer travels back up to the client side. Let's look directly at how FillBufferDone is handled:

/*========================================================*/

/* @ fn SampleTest_FillBufferDone ::  Application callback*/

/*========================================================*/

OMX_ERRORTYPE OMXCameraAdapterFillBufferDone(OMX_IN OMX_HANDLETYPE hComponent,

                                   OMX_IN OMX_PTR pAppData,

                                   OMX_IN OMX_BUFFERHEADERTYPE* pBuffHeader)

{

    TIUTILS::Message msg;

    OMX_ERRORTYPE eError = OMX_ErrorNone;

    if (UNLIKELY(mDebugFps)) {

        debugShowFPS();

    }

    OMXCameraAdapter *adapter =  ( OMXCameraAdapter * ) pAppData;

    if ( NULL != adapter )

        {

        msg.command = OMXCameraAdapter::OMXCallbackHandler::CAMERA_FILL_BUFFER_DONE;

        msg.arg1 = ( void * ) hComponent;

        msg.arg2 = ( void * ) pBuffHeader;

        adapter->mOMXCallbackHandler->put(&msg);

        }

    return eError;

}

This callback merely packages the event into a message and posts it; the message is ultimately processed by the Handler of OMXCallbackHandler:
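Before looking at the Handler itself, the put()/wait mechanics of such a message queue can be sketched roughly as follows (hypothetical names; this is a simplification, not TI's actual `TIUTILS::MessageQueue`):

```cpp
#include <cassert>
#include <condition_variable>
#include <deque>
#include <mutex>

// Hypothetical sketch of the queue behind put()/waitForMsg(): the OMX
// callback thread enqueues, and the handler thread blocks until a message
// arrives, then dispatches on msg.command.
struct Message {
    int   command;
    void *arg1;
    void *arg2;
};

class CallbackQueue {
public:
    void put(const Message &msg) {
        std::lock_guard<std::mutex> lock(mLock);
        mQueue.push_back(msg);   // enqueue on the callback thread
        mCond.notify_one();      // wake the handler thread
    }
    Message waitForMsg() {
        std::unique_lock<std::mutex> lock(mLock);
        mCond.wait(lock, [this] { return !mQueue.empty(); });
        Message msg = mQueue.front();
        mQueue.pop_front();
        return msg;
    }
    bool isEmpty() {
        std::lock_guard<std::mutex> lock(mLock);
        return mQueue.empty();
    }
private:
    std::mutex mLock;
    std::condition_variable mCond;
    std::deque<Message> mQueue;
};
```

The key design point is that the OMX callback returns immediately; all real work happens on the handler thread, so the Ducati side is never blocked waiting on the A9 side.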

bool OMXCameraAdapter::OMXCallbackHandler::Handler()

{

    TIUTILS::Message msg;

    volatile int forever = 1;

    status_t ret = NO_ERROR;

    LOG_FUNCTION_NAME;

    while(forever){

        TIUTILS::MessageQueue::waitForMsg(&mCommandMsgQ, NULL, NULL, -1);

        { // a message was received; move on

        Mutex::Autolock lock(mLock);

        mCommandMsgQ.get(&msg);

        mIsProcessed = false;

        }

        switch ( msg.command ) {

            case OMXCallbackHandler::CAMERA_FILL_BUFFER_DONE:

            {

                ret = mCameraAdapter->OMXCameraAdapterFillBufferDone(( OMX_HANDLETYPE ) msg.arg1,

                                                                     ( OMX_BUFFERHEADERTYPE *) msg.arg2);

                break;

            }

            case OMXCallbackHandler::CAMERA_FOCUS_STATUS:

            {

                mCameraAdapter->handleFocusCallback();

                break;

            }

            case CommandHandler::COMMAND_EXIT:

            {

                CAMHAL_LOGDA("Exiting OMX callback handler");

                forever = 0;

                break;

            }

        }

        {

            android::AutoMutex locker(mLock);

            CAMHAL_UNUSED(locker);

            mIsProcessed = mCommandMsgQ.isEmpty();

            if ( mIsProcessed )

                mCondition.signal();

        }

    }

    // force the condition to wake

    {

        android::AutoMutex locker(mLock);

        CAMHAL_UNUSED(locker);

        mIsProcessed = true;

        mCondition.signal();

    }

    LOG_FUNCTION_NAME_EXIT;

    return false;

}

When the CAMERA_FILL_BUFFER_DONE message is dequeued, the handler calls OMXCameraAdapter::OMXCameraAdapterFillBufferDone to do the real processing:

#endif

/*========================================================*/

/* @ fn SampleTest_FillBufferDone ::  Application callback*/

/*========================================================*/

OMX_ERRORTYPE OMXCameraAdapter::OMXCameraAdapterFillBufferDone(OMX_IN OMX_HANDLETYPE hComponent,

                                   OMX_IN OMX_BUFFERHEADERTYPE* pBuffHeader)

{

    status_t  stat = NO_ERROR;

    status_t  res1, res2;

    OMXCameraPortParameters  *pPortParam;

    OMX_ERRORTYPE eError = OMX_ErrorNone;

    CameraFrame::FrameType typeOfFrame = CameraFrame::ALL_FRAMES;

    unsigned int refCount = 0;

    BaseCameraAdapter::AdapterState state, nextState;

    BaseCameraAdapter::getState(state);

    BaseCameraAdapter::getNextState(nextState);

    sp<CameraMetadataResult> metadataResult = NULL;

    unsigned int mask = 0xFFFF;

    CameraFrame cameraFrame;

    OMX_OTHER_EXTRADATATYPE *extraData;

    OMX_TI_ANCILLARYDATATYPE *ancillaryData = NULL;

    bool snapshotFrame = false;

    if ( NULL == pBuffHeader ) {

        return OMX_ErrorBadParameter;

    }

#ifdef CAMERAHAL_OMX_PROFILING

    storeProfilingData(pBuffHeader);

#endif

    res1 = res2 = NO_ERROR;

    if ( !pBuffHeader || !pBuffHeader->pBuffer ) {

        CAMHAL_LOGEA("NULL Buffer from OMX");

        return OMX_ErrorNone;

    }

    pPortParam = &(mCameraAdapterParameters.mCameraPortParams[pBuffHeader->nOutputPortIndex]);

    // Find buffer and mark it as filled

    for (int i = 0; i < pPortParam->mNumBufs; i++) {

        if (pPortParam->mBufferHeader[i] == pBuffHeader) {

            pPortParam->mStatus[i] = OMXCameraPortParameters::DONE;

        }

    }

    if (pBuffHeader->nOutputPortIndex == OMX_CAMERA_PORT_VIDEO_OUT_PREVIEW)

        {

        if ( ( PREVIEW_ACTIVE & state ) != PREVIEW_ACTIVE )

            {

            return OMX_ErrorNone;

            }

        if ( mWaitingForSnapshot ) {

            extraData = getExtradata(pBuffHeader->pPlatformPrivate,

                                     (OMX_EXTRADATATYPE) OMX_AncillaryData);

            if ( NULL != extraData ) {

                ancillaryData = (OMX_TI_ANCILLARYDATATYPE*) extraData->data;

                if ((OMX_2D_Snap == ancillaryData->eCameraView)

                    || (OMX_3D_Left_Snap == ancillaryData->eCameraView)

                    || (OMX_3D_Right_Snap == ancillaryData->eCameraView)) {

                    snapshotFrame = OMX_TRUE;

                } else {

                    snapshotFrame = OMX_FALSE;

                }

                mPending3Asettings |= SetFocus;

            }

        }

        ///Prepare the frames to be sent - initialize CameraFrame object and reference count

        // TODO(XXX): ancillary data for snapshot frame is not being sent for video snapshot

        //            if we are waiting for a snapshot and in video mode...go ahead and send

        //            this frame as a snapshot

        if( mWaitingForSnapshot &&  (mCapturedFrames > 0) &&

            (snapshotFrame || (mCapMode == VIDEO_MODE)))

            {

            typeOfFrame = CameraFrame::SNAPSHOT_FRAME;

            mask = (unsigned int)CameraFrame::SNAPSHOT_FRAME;

            // video snapshot gets ancillary data and wb info from last snapshot frame

            mCaptureAncillaryData = ancillaryData;

            mWhiteBalanceData = NULL;

            extraData = getExtradata(pBuffHeader->pPlatformPrivate,

                                     (OMX_EXTRADATATYPE) OMX_WhiteBalance);

            if ( NULL != extraData )

                {

                mWhiteBalanceData = (OMX_TI_WHITEBALANCERESULTTYPE*) extraData->data;

                }

            }

        else

            {

            typeOfFrame = CameraFrame::PREVIEW_FRAME_SYNC;

            mask = (unsigned int)CameraFrame::PREVIEW_FRAME_SYNC;

            }

        if (mRecording)

            {

            mask |= (unsigned int)CameraFrame::VIDEO_FRAME_SYNC;

            mFramesWithEncoder++;

            }

        //LOGV("FBD pBuffer = 0x%x", pBuffHeader->pBuffer);

        if( mWaitingForSnapshot )

          {

            if (!mBracketingEnabled &&

                 ((HIGH_SPEED == mCapMode) || (VIDEO_MODE == mCapMode)) )

              {

                notifyShutterSubscribers();

              }

          }

        stat = sendCallBacks(cameraFrame, pBuffHeader, mask, pPortParam);

        mFramesWithDisplay++;

        mFramesWithDucati--;

#ifdef CAMERAHAL_DEBUG

        if(mBuffersWithDucati.indexOfKey((uint32_t)pBuffHeader->pBuffer)<0)

            {

            LOGE("Buffer was never with Ducati!! %p", pBuffHeader->pBuffer);

            for(unsigned int i=0;i<mBuffersWithDucati.size();i++) LOGE("0x%x", mBuffersWithDucati.keyAt(i));

            }

        mBuffersWithDucati.removeItem((int)pBuffHeader->pBuffer);

#endif

        if(mDebugFcs)

            CAMHAL_LOGEB("C[%d] D[%d] E[%d]", mFramesWithDucati, mFramesWithDisplay, mFramesWithEncoder);

        recalculateFPS();

        createPreviewMetadata(pBuffHeader, metadataResult, pPortParam->mWidth, pPortParam->mHeight);

        if ( NULL != metadataResult.get() ) {

            notifyMetadataSubscribers(metadataResult);

            metadataResult.clear();

        }

        {

            Mutex::Autolock lock(mFaceDetectionLock);

            if ( mFDSwitchAlgoPriority ) {

                 //Disable region priority and enable face priority for AF

                 setAlgoPriority(REGION_PRIORITY, FOCUS_ALGO, false);

                 setAlgoPriority(FACE_PRIORITY, FOCUS_ALGO , true);

                 //Disable Region priority and enable Face priority

                 setAlgoPriority(REGION_PRIORITY, EXPOSURE_ALGO, false);

                 setAlgoPriority(FACE_PRIORITY, EXPOSURE_ALGO, true);

                 mFDSwitchAlgoPriority = false;

            }

        }

        sniffDccFileDataSave(pBuffHeader);

        stat |= advanceZoom();

        // On the fly update to 3A settings not working

        // Do not update 3A here if we are in the middle of a capture

        // or in the middle of transitioning to it

        if( mPending3Asettings &&

                ( (nextState & CAPTURE_ACTIVE) == 0 ) &&

                ( (state & CAPTURE_ACTIVE) == 0 ) ) {

            apply3Asettings(mParameters3A);

        }

        }

    else if( pBuffHeader->nOutputPortIndex == OMX_CAMERA_PORT_VIDEO_OUT_MEASUREMENT )

        {

        typeOfFrame = CameraFrame::FRAME_DATA_SYNC;

        mask = (unsigned int)CameraFrame::FRAME_DATA_SYNC;

        stat = sendCallBacks(cameraFrame, pBuffHeader, mask, pPortParam);

       }

    else if( pBuffHeader->nOutputPortIndex == OMX_CAMERA_PORT_IMAGE_OUT_IMAGE )

    {

        OMX_COLOR_FORMATTYPE pixFormat;

        const char *valstr = NULL;

        pixFormat = pPortParam->mColorFormat;

        if ( OMX_COLOR_FormatUnused == pixFormat )

            {

            typeOfFrame = CameraFrame::IMAGE_FRAME;

            mask = (unsigned int) CameraFrame::IMAGE_FRAME;

        } else if ( pixFormat == OMX_COLOR_FormatCbYCrY &&

                  ((mPictureFormatFromClient &&

                          !strcmp(mPictureFormatFromClient,

                                  CameraParameters::PIXEL_FORMAT_JPEG)) ||

                   !mPictureFormatFromClient) ) {

            // signals to callbacks that this needs to be converted to jpeg

            // before returning to framework

            typeOfFrame = CameraFrame::IMAGE_FRAME;

            mask = (unsigned int) CameraFrame::IMAGE_FRAME;

            cameraFrame.mQuirks |= CameraFrame::ENCODE_RAW_YUV422I_TO_JPEG;

            cameraFrame.mQuirks |= CameraFrame::FORMAT_YUV422I_UYVY;

            // populate exif data and pass to subscribers via quirk

            // subscriber is in charge of freeing exif data

            ExifElementsTable* exif = new ExifElementsTable();

            setupEXIF_libjpeg(exif, mCaptureAncillaryData, mWhiteBalanceData);

            cameraFrame.mQuirks |= CameraFrame::HAS_EXIF_DATA;

            cameraFrame.mCookie2 = (void*) exif;

        } else {

            typeOfFrame = CameraFrame::RAW_FRAME;

            mask = (unsigned int) CameraFrame::RAW_FRAME;

        }

            pPortParam->mImageType = typeOfFrame;

            if((mCapturedFrames>0) && !mCaptureSignalled)

                {

                mCaptureSignalled = true;

                mCaptureSem.Signal();

                }

            if( ( CAPTURE_ACTIVE & state ) != CAPTURE_ACTIVE )

                {

                goto EXIT;

                }

            {

            Mutex::Autolock lock(mBracketingLock);

            if ( mBracketingEnabled )

                {

                doBracketing(pBuffHeader, typeOfFrame);

                return eError;

                }

            }

            if (mZoomBracketingEnabled) {

                doZoom(mZoomBracketingValues[mCurrentZoomBracketing]);

                CAMHAL_LOGDB("Current Zoom Bracketing: %d", mZoomBracketingValues[mCurrentZoomBracketing]);

                mCurrentZoomBracketing++;

                if (mCurrentZoomBracketing == ARRAY_SIZE(mZoomBracketingValues)) {

                    mZoomBracketingEnabled = false;

                }

            }

        if ( 1 > mCapturedFrames )

            {

            goto EXIT;

            }

#ifdef OMAP_ENHANCEMENT_CPCAM

        setMetaData(cameraFrame.mMetaData, pBuffHeader->pPlatformPrivate);

#endif

        CAMHAL_LOGDB("Captured Frames: %d", mCapturedFrames);

        mCapturedFrames--;

#ifdef CAMERAHAL_USE_RAW_IMAGE_SAVING

        if (mYuvCapture) {

            struct timeval timeStampUsec;

            gettimeofday(&timeStampUsec, NULL);

            time_t saveTime;

            time(&saveTime);

            const struct tm * const timeStamp = gmtime(&saveTime);

            char filename[256];

            snprintf(filename,256, "%s/yuv_%d_%d_%d_%lu.yuv",

                    kYuvImagesOutputDirPath,

                    timeStamp->tm_hour,

                    timeStamp->tm_min,

                    timeStamp->tm_sec,

                    timeStampUsec.tv_usec);

            const status_t saveBufferStatus = saveBufferToFile(((CameraBuffer*)pBuffHeader->pAppPrivate)->mapped,

                                               pBuffHeader->nFilledLen, filename);

            if (saveBufferStatus != OK) {

                CAMHAL_LOGE("ERROR: %d, while saving yuv!", saveBufferStatus);

            } else {

                CAMHAL_LOGD("yuv_%d_%d_%d_%lu.yuv successfully saved in %s",

                        timeStamp->tm_hour,

                        timeStamp->tm_min,

                        timeStamp->tm_sec,

                        timeStampUsec.tv_usec,

                        kYuvImagesOutputDirPath);

            }

        }

#endif

        stat = sendCallBacks(cameraFrame, pBuffHeader, mask, pPortParam);

        }

        else if (pBuffHeader->nOutputPortIndex == OMX_CAMERA_PORT_VIDEO_OUT_VIDEO) {

            typeOfFrame = CameraFrame::RAW_FRAME;

            pPortParam->mImageType = typeOfFrame;

            {

                Mutex::Autolock lock(mLock);

                if( ( CAPTURE_ACTIVE & state ) != CAPTURE_ACTIVE ) {

                    goto EXIT;

                }

            }

            CAMHAL_LOGD("RAW buffer done on video port, length = %d", pBuffHeader->nFilledLen);

            mask = (unsigned int) CameraFrame::RAW_FRAME;

#ifdef CAMERAHAL_USE_RAW_IMAGE_SAVING

            if ( mRawCapture ) {

                struct timeval timeStampUsec;

                gettimeofday(&timeStampUsec, NULL);

                time_t saveTime;

                time(&saveTime);

                const struct tm * const timeStamp = gmtime(&saveTime);

                char filename[256];

                snprintf(filename,256, "%s/raw_%d_%d_%d_%lu.raw",

                         kRawImagesOutputDirPath,

                         timeStamp->tm_hour,

                         timeStamp->tm_min,

                         timeStamp->tm_sec,

                         timeStampUsec.tv_usec);

                const status_t saveBufferStatus = saveBufferToFile( ((CameraBuffer*)pBuffHeader->pAppPrivate)->mapped,

                                                   pBuffHeader->nFilledLen, filename);

                if (saveBufferStatus != OK) {

                    CAMHAL_LOGE("ERROR: %d , while saving raw!", saveBufferStatus);

                } else {

                    CAMHAL_LOGD("raw_%d_%d_%d_%lu.raw successfully saved in %s",

                                timeStamp->tm_hour,

                                timeStamp->tm_min,

                                timeStamp->tm_sec,

                                timeStampUsec.tv_usec,

                                kRawImagesOutputDirPath);

                    stat = sendCallBacks(cameraFrame, pBuffHeader, mask, pPortParam);

                }

            }

#endif

        } else {

            CAMHAL_LOGEA("Frame received for non-(preview/capture/measure) port. This is yet to be supported");

            goto EXIT;

        }

    if ( NO_ERROR != stat ) // reached only on failure: if sendCallBacks above did not succeed, return the frame right away

        {

        CameraBuffer *camera_buffer;

        camera_buffer = (CameraBuffer *)pBuffHeader->pAppPrivate;

        CAMHAL_LOGDB("sendFrameToSubscribers error: %d", stat);

        returnFrame(camera_buffer, typeOfFrame);

        }

    return eError;

    EXIT:

    CAMHAL_LOGEB("Exiting function %s because of ret %d eError=%x", __FUNCTION__, stat, eError);

    if ( NO_ERROR != stat )

        {

        if ( NULL != mErrorNotifier )

            {

            mErrorNotifier->errorNotify(CAMERA_ERROR_UNKNOWN);

            }

        }

    return eError;

}
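Stepping back, the core dispatch in OMXCameraAdapterFillBufferDone is: the output port a filled buffer arrives on determines its frame type and the subscriber mask it is fanned out with. A simplified sketch of that decision (hypothetical constants; the real code adds snapshot/ancillary-data and capture-state checks):

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical port and mask constants, simplified from the dispatch in
// OMXCameraAdapterFillBufferDone().
enum Port { PORT_PREVIEW, PORT_MEASUREMENT, PORT_IMAGE, PORT_VIDEO };
enum Mask : uint32_t {
    MASK_PREVIEW  = 1u << 0,  // CameraFrame::PREVIEW_FRAME_SYNC
    MASK_SNAPSHOT = 1u << 1,  // CameraFrame::SNAPSHOT_FRAME
    MASK_VIDEO    = 1u << 2,  // CameraFrame::VIDEO_FRAME_SYNC
    MASK_DATA     = 1u << 3,  // CameraFrame::FRAME_DATA_SYNC
    MASK_IMAGE    = 1u << 4,  // CameraFrame::IMAGE_FRAME
    MASK_RAW      = 1u << 5,  // CameraFrame::RAW_FRAME
};

// The port a filled buffer arrives on decides which subscribers see it.
uint32_t classifyBuffer(Port port, bool waitingForSnapshot, bool recording) {
    switch (port) {
    case PORT_PREVIEW: {
        uint32_t mask = waitingForSnapshot ? MASK_SNAPSHOT : MASK_PREVIEW;
        if (recording)
            mask |= MASK_VIDEO;  // the same buffer also feeds the encoder
        return mask;
    }
    case PORT_MEASUREMENT: return MASK_DATA;
    case PORT_IMAGE:       return MASK_IMAGE;  // or RAW, depending on format
    case PORT_VIDEO:       return MASK_RAW;
    }
    return 0;
}
```

Because the mask is a bitfield, one physical buffer can be delivered to several subscribers at once (display plus encoder during recording), which is exactly why the reference counting in returnFrame() below is needed.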

I won't go into how the frame data is actually delivered to the application here; what really interests me is the last step of this method: returnFrame.

So what exactly does returnFrame do?

void BaseCameraAdapter::returnFrame(CameraBuffer * frameBuf, CameraFrame::FrameType frameType)

{

    status_t res = NO_ERROR;

    size_t subscriberCount = 0;

    int refCount = -1;

    if ( NULL == frameBuf )

        {

        CAMHAL_LOGEA("Invalid frameBuf");

        return;

        }

    if ( NO_ERROR == res)

        {

        Mutex::Autolock lock(mReturnFrameLock);

        refCount = getFrameRefCount(frameBuf,  frameType);

        if(frameType == CameraFrame::PREVIEW_FRAME_SYNC)

            {

            mFramesWithDisplay--;

            }

        else if(frameType == CameraFrame::VIDEO_FRAME_SYNC)

            {

            mFramesWithEncoder--;

            }

        if ( 0 < refCount )

            {

            refCount--;

            setFrameRefCount(frameBuf, frameType, refCount);

            if ( mRecording && (CameraFrame::VIDEO_FRAME_SYNC == frameType) ) {

                refCount += getFrameRefCount(frameBuf, CameraFrame::PREVIEW_FRAME_SYNC);

            } else if ( mRecording && (CameraFrame::PREVIEW_FRAME_SYNC == frameType) ) {

                refCount += getFrameRefCount(frameBuf, CameraFrame::VIDEO_FRAME_SYNC);

            } else if ( mRecording && (CameraFrame::SNAPSHOT_FRAME == frameType) ) {

                refCount += getFrameRefCount(frameBuf, CameraFrame::VIDEO_FRAME_SYNC);

            }

            }

        else

            {

            CAMHAL_LOGDA("Frame returned when ref count is already zero!!");

            return;

            }

        }

    CAMHAL_LOGVB("REFCOUNT 0x%x %d", frameBuf, refCount);

    if ( NO_ERROR == res )

        {

        //check if someone is holding this buffer

        if ( 0 == refCount )

            {

#ifdef CAMERAHAL_DEBUG

            if((mBuffersWithDucati.indexOfKey((int)camera_buffer_get_omx_ptr(frameBuf)) >= 0) &&

               ((CameraFrame::PREVIEW_FRAME_SYNC == frameType) ||

                 (CameraFrame::SNAPSHOT_FRAME == frameType)))

                {

                LOGE("Buffer already with Ducati!! 0x%x", frameBuf);

                for(int i=0;i<mBuffersWithDucati.size();i++) LOGE("0x%x", mBuffersWithDucati.keyAt(i));

                }

            mBuffersWithDucati.add((int)camera_buffer_get_omx_ptr(frameBuf),1);

#endif

            res = fillThisBuffer(frameBuf, frameType);

            }

        }

                     

}

As the part I flagged at the end shows: once a buffer has been fully consumed (its reference count drops to zero), it is filled back into the component via fillThisBuffer and reused — that, at least, is my understanding.
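The recycling logic can be sketched as follows (hypothetical names; in the real code fillThisBuffer() re-queues the buffer on the matching OMX port):

```cpp
#include <cassert>
#include <functional>
#include <map>

// Hypothetical sketch of returnFrame(): each subscriber (display, encoder,
// preview callback) holds a reference on the buffer; only when the count
// drops to zero is the buffer handed back to the OMX component — the
// fillThisBuffer() call in the real code.
class FrameTracker {
public:
    explicit FrameTracker(std::function<void(int)> refill)
        : mRefill(std::move(refill)) {}

    void setRefCount(int buf, int count) { mRefCount[buf] = count; }

    void returnFrame(int buf) {
        int &ref = mRefCount[buf];
        if (ref <= 0)
            return;              // "returned when ref count is already zero"
        if (--ref == 0)
            mRefill(buf);        // no subscriber holds it: refill the port
    }

private:
    std::map<int, int> mRefCount;        // buffer id -> outstanding holders
    std::function<void(int)> mRefill;    // stands in for fillThisBuffer()
};
```

During recording this matters in practice: a preview buffer with refcount 2 (display + encoder) only goes back to Ducati after both sides have released it.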

That's all for now; I'll cover the rest in more detail later.

To be continued...

