Category: Android platform

2014-03-20 17:15:46

Android Camera Data Flow: A Complete Walkthrough

I spent quite a bit of time analyzing this data flow. Since I haven't done much Android work, I'm recording my own understanding here; corrections from more experienced readers are very welcome, and I expect to keep revising this.
The analysis starts from the app's onCreate: packages/apps/OMAPCamera/src/com/ti/omap4/android/camera/Camera.java
onCreate performs a lot of initialization, but the statements we really care about are the following:
    // don't set mSurfaceHolder here. We have it set ONLY within
    // surfaceChanged / surfaceDestroyed, other parts of the code
    // assume that when it is set, the surface is also set.
    SurfaceView preview = (SurfaceView) findViewById(R.id.camera_preview);
    SurfaceHolder holder = preview.getHolder();
    holder.addCallback(this);
Here we obtain a SurfaceView instance, get its SurfaceHolder, and register this activity as the holder's callback via addCallback.
SurfaceView is defined in: frameworks/base/core/java/android/view/SurfaceView.java
SurfaceHolder is defined in: frameworks/base/core/java/android/view/SurfaceHolder.java

The following article explains this background quite well: http://blog.chinaunix.net/uid-9863638-id-1996383.html

SurfaceFlinger is part of Android's multimedia stack. It is implemented as a service that provides system-wide surface composition: it combines the 2D and 3D surfaces of all applications.
Before discussing SurfaceFlinger itself, let's review some display basics.

Each application may own one or more graphical interfaces, and each interface is called a surface (or window). In the figure from that article there are four surfaces: the home screen plus three surfaces drawn in red, green and blue, while the two buttons are actually part of the home surface. Two problems must be solved to display these:
a. Each surface has a position on screen, a size, and contents to display. Contents, size and position can all change as the user switches applications; how should such changes be handled?
b. Surfaces may overlap. In the figure, green covers blue, and red covers green, blue and the home screen below, possibly with some transparency. How should this layering relationship be described?
Consider the second problem first. Imagine a Z axis perpendicular to the screen plane: every surface gets a coordinate on this axis that determines what is in front of what, which fully describes the overlap relationships between surfaces. This ordering along the Z axis has a standard name in graphics: Z-order.
For the first problem we need a structure that records the position and size of an application's interface, plus a buffer holding the contents to display. That is exactly what a surface is: think of it as a container holding control information about the application's interface (size, position, and so on) together with a buffer dedicated to the pixels to be shown.
One issue remains: what happens where surfaces overlap, especially when some carry transparency? This is precisely what SurfaceFlinger solves: it composes (merges) the individual surfaces into one main surface and sends the main surface's contents to the FB/V4L2 output, producing the final image on screen.
In practice this merge can be done in two ways: in software, which is SurfaceFlinger's job, or in hardware, which is Overlay.
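The Z-order compositing described above can be sketched in a few lines. This is a deliberately tiny toy (the Surface struct and compose() below are hypothetical illustrations, not SurfaceFlinger's real API): sort surfaces by ascending Z, then alpha-blend each one over the accumulated result, back to front.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical single-pixel "surface": one grayscale channel with a
// Z coordinate and an alpha value, for brevity.
struct Surface {
    int z;        // Z-order: larger z is closer to the viewer
    float alpha;  // 0.0 = fully transparent, 1.0 = opaque
    float color;  // grayscale value in [0, 1]
};

// Compose surfaces back to front (painter's algorithm), conceptually what a
// software compositor like SurfaceFlinger does for every output pixel.
float compose(std::vector<Surface> surfaces, float background) {
    std::sort(surfaces.begin(), surfaces.end(),
              [](const Surface& a, const Surface& b) { return a.z < b.z; });
    float out = background;
    for (const Surface& s : surfaces)
        out = s.alpha * s.color + (1.0f - s.alpha) * out;  // "over" blend
    return out;
}
```

With an opaque top surface the layers below are completely hidden; with a half-transparent one they show through, which is exactly the red-over-green-over-blue situation in the figure.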

First, subclass SurfaceView and implement the SurfaceHolder.Callback interface.
Why the interface is needed: the rule with SurfaceView is that all drawing must happen after the Surface has been created and must stop before it is destroyed. (A Surface can essentially be thought of as a mapping of display memory: contents written to the Surface can be copied straight to display memory, which makes display very fast.) The surfaceCreated and surfaceDestroyed callbacks therefore delimit the lifetime of any drawing code.
Methods to override:
 (1) public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {} // fired when the surface's size changes
 (2) public void surfaceCreated(SurfaceHolder holder) {} // fired on creation; the drawing thread is usually started here
 (3) public void surfaceDestroyed(SurfaceHolder holder) {} // fired on destruction; the drawing thread is usually stopped and released here
The app already overrides all of these; surfaceChanged is the one worth a close look:
    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        // Make sure we have a surface in the holder before proceeding.
        if (holder.getSurface() == null) {
            Log.d(TAG, "holder.getSurface() == null");
            return;
        }

        Log.v(TAG, "surfaceChanged. w=" + w + ". h=" + h);

        // We need to save the holder for later use, even when the mCameraDevice
        // is null. This could happen if onResume() is invoked after this
        // function.
        mSurfaceHolder = holder;

        // The mCameraDevice will be null if it fails to connect to the camera
        // hardware. In this case we will show a dialog and then finish the
        // activity, so it's OK to ignore it.
        if (mCameraDevice == null) return;

        // Sometimes surfaceChanged is called after onPause or before onResume.
        // Ignore it.
        if (mPausing || isFinishing()) return;

        setSurfaceLayout();

        // Set preview display if the surface is being created. Preview was
        // already started. Also restart the preview if display rotation has
        // changed. Sometimes this happens when the device is held in portrait
        // and camera app is opened. Rotation animation takes some time and
        // display rotation in onCreate may not be what we want.
        if (mCameraState == PREVIEW_STOPPED) { // check whether the preview is already running: a cold start and a re-entry take different paths
            startPreview(true);
            startFaceDetection();
        } else {
            if (Util.getDisplayRotation(this) != mDisplayRotation) {
                setDisplayOrientation();
            }
            if (holder.isCreating()) {
                // Set preview display if the surface is being created and preview
                // was already started. That means preview display was set to null
                // and we need to set it now.
                setPreviewDisplay(holder);
            }
        }

        // If first time initialization is not finished, send a message to do
        // it later. We want to finish surfaceChanged as soon as possible to let
        // user see preview first.
        if (!mFirstTimeInitialized) {
            mHandler.sendEmptyMessage(FIRST_TIME_INIT);
        } else {
            initializeSecondTime();
        }

        SurfaceView preview = (SurfaceView) findViewById(R.id.camera_preview);
        CameraInfo info = CameraHolder.instance().getCameraInfo()[mCameraId];
        boolean mirror = (info.facing == CameraInfo.CAMERA_FACING_FRONT);
        int displayRotation = Util.getDisplayRotation(this);
        int displayOrientation = Util.getDisplayOrientation(displayRotation, mCameraId);

        mTouchManager.initialize(preview.getHeight() / 3, preview.getHeight() / 3,
               preview, this, mirror, displayOrientation);

    }
The highlighted parts above are the key. Let's go straight to startPreview, the handler for the first camera launch, which performs some initialization. When the camera is already open, startPreview is not needed; the other branch above simply resumes the display.
    private void startPreview(boolean updateAll) {
        if (mPausing || isFinishing()) return;

        mFocusManager.resetTouchFocus();

        mCameraDevice.setErrorCallback(mErrorCallback);

        // If we're previewing already, stop the preview first (this will blank
        // the screen).
        if (mCameraState != PREVIEW_STOPPED) stopPreview();

        setPreviewDisplay(mSurfaceHolder);
        setDisplayOrientation();

        if (!mSnapshotOnIdle) {
            // If the focus mode is continuous autofocus, call cancelAutoFocus to
            // resume it because it may have been paused by autoFocus call.
            if (Parameters.FOCUS_MODE_CONTINUOUS_PICTURE.equals(mFocusManager.getFocusMode())) {
                mCameraDevice.cancelAutoFocus();
            }
            mFocusManager.setAeAwbLock(false); // Unlock AE and AWB.
        }

        if ( updateAll ) {
            Log.v(TAG, "Updating all parameters!");
            setCameraParameters(UPDATE_PARAM_INITIALIZE | UPDATE_PARAM_ZOOM | UPDATE_PARAM_PREFERENCE);
        } else {
            setCameraParameters(UPDATE_PARAM_MODE);
        }

        //setCameraParameters(UPDATE_PARAM_ALL);

        // Inform the mainthread to go on the UI initialization.
        if (mCameraPreviewThread != null) {
            synchronized (mCameraPreviewThread) {
                mCameraPreviewThread.notify();
            }
        }

        try {
            Log.v(TAG, "startPreview");
            mCameraDevice.startPreview();
        } catch (Throwable ex) {
            closeCamera();
            throw new RuntimeException("startPreview failed", ex);
        }

        mZoomState = ZOOM_STOPPED;
        setCameraState(IDLE);
        mFocusManager.onPreviewStarted();
        if ( mTempBracketingEnabled ) {
            mFocusManager.setTempBracketingState(FocusManager.TempBracketingStates.ACTIVE);
        }

        if (mSnapshotOnIdle) {
            mHandler.post(mDoSnapRunnable);
        }
    }
The idea here: setPreviewDisplay first binds the surface as the preview window. This call reaches all the way down into the HAL layer, where it performs very important initialization that sets up the data callbacks.

One point I must stress. I had long been looking for what decides whether overlay is used or not, and this setPreviewDisplay method is the "culprit".
The parameter passed into setPreviewDisplay is the SurfaceView. On its way down to the HAL layer its form changes, but as I understand it this is like a person changing clothes: Zhang San in today's outfit is still the same Zhang San as yesterday. At the HAL layer the parameter arrives as a preview_stream_ops; you will see how step by step below.
In CameraHal's setPreviewDisplay, whether or not overlay is used is decided by checking whether the preview_stream_ops pointer passed down is null. This is important.
This article only mentions overlay at this point; below, overlay will not come up again, and the whole data flow is analyzed assuming overlay is NOT used. Be careful not to mix the two cases up.
The overlay data-return path, and exactly what makes the final decision between using and not using overlay, will be covered in a separate chapter.

The call flow is: app --> frameworks --> (via JNI) --> camera client --> camera service --> (via the hardware interface) --> hal_module --> HAL
The camera service side of the call is well worth a look:
    // set the Surface that the preview will use
    status_t CameraService::Client::setPreviewDisplay(const sp<Surface>& surface) {
        LOG1("setPreviewDisplay(%p) (pid %d)", surface.get(), getCallingPid());

        sp<IBinder> binder(surface != 0 ? surface->asBinder() : 0);
        sp<ANativeWindow> window(surface);
        return setPreviewWindow(binder, window);
    }
I don't fully understand this part yet: the surface passed in from the app is converted into an IBinder and an ANativeWindow, and these two are then passed on to the differently-parameterized setPreviewWindow:
    status_t CameraService::Client::setPreviewWindow(const sp<IBinder>& binder,
            const sp<ANativeWindow>& window) {
        Mutex::Autolock lock(mLock);
        status_t result = checkPidAndHardware();
        if (result != NO_ERROR) return result;

        // return if no change in surface.
        if (binder == mSurface) {
            return NO_ERROR;
        }

        if (window != 0) {
            result = native_window_api_connect(window.get(), NATIVE_WINDOW_API_CAMERA);
            if (result != NO_ERROR) {
                LOGE("native_window_api_connect failed: %s (%d)", strerror(-result),
                        result);
                return result;
            }
        }

        // If preview has been already started, register preview buffers now.
        if (mHardware->previewEnabled()) {
            if (window != 0) {
                native_window_set_scaling_mode(window.get(),
                        NATIVE_WINDOW_SCALING_MODE_SCALE_TO_WINDOW);
                native_window_set_buffers_transform(window.get(), mOrientation);
                result = mHardware->setPreviewWindow(window);
            }
        }

        if (result == NO_ERROR) {
            // Everything has succeeded. Disconnect the old window and remember the
            // new window.
            disconnectWindow(mPreviewWindow);
            mSurface = binder;
            mPreviewWindow = window;
        } else {
            // Something went wrong after we connected to the new window, so
            // disconnect here.
            disconnectWindow(window);
        }

        return result;
    }
The above in turn calls setPreviewWindow in CameraHardwareInterface:

    status_t setPreviewWindow(const sp<ANativeWindow>& buf)
    {
        LOGV("%s(%s) buf %p", __FUNCTION__, mName.string(), buf.get());

        if (mDevice->ops->set_preview_window) {
            mPreviewWindow = buf;
    #ifdef OMAP_ENHANCEMENT_CPCAM
            mHalPreviewWindow.user = mPreviewWindow.get();
    #else
            mHalPreviewWindow.user = this;
    #endif
            LOGV("%s &mHalPreviewWindow %p mHalPreviewWindow.user %p", __FUNCTION__,
                    &mHalPreviewWindow, mHalPreviewWindow.user);
            return mDevice->ops->set_preview_window(mDevice,
                    buf.get() ? &mHalPreviewWindow.nw : 0);
        }
        return INVALID_OPERATION;
    }
By this point the parameter has gone from surface --> ANativeWindow --> preview_stream_ops. The parameter reaching the bottom layer has changed substantially in form; we will meet this variable again during the data callback, so keep it in mind.
Calling it a "substantial" change is only half true, though: dig a little deeper and this preview_stream_ops is really just the surface in yet another guise.
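This "same person, different clothes" wrapping can be illustrated with a small sketch. The types below are hypothetical miniatures, not the real camera.h definitions: a C-style table of function pointers plus an opaque user pointer is what crosses the HAL boundary, while the object underneath remains the same window.

```cpp
#include <queue>

// Hypothetical stand-in for ANativeWindow: a simple queue of buffer ids.
struct FakeWindow {
    std::queue<int> buffers;
};

// Hypothetical stand-in for preview_stream_ops: plain function pointers plus
// an opaque user pointer, the only shape a C HAL module can consume.
struct FakeStreamOps {
    void* user;
    int (*dequeue_buffer)(FakeStreamOps* ops);
    void (*enqueue_buffer)(FakeStreamOps* ops, int buf);
};

// Thunks that recover the real window from the opaque pointer.
static int dequeueThunk(FakeStreamOps* ops) {
    auto* w = static_cast<FakeWindow*>(ops->user);
    int b = w->buffers.front();
    w->buffers.pop();
    return b;
}

static void enqueueThunk(FakeStreamOps* ops, int buf) {
    static_cast<FakeWindow*>(ops->user)->buffers.push(buf);
}

// "Dressing" the window as an ops table, conceptually what
// CameraHardwareInterface does when it fills in mHalPreviewWindow.
FakeStreamOps wrap(FakeWindow* w) {
    return FakeStreamOps{w, dequeueThunk, enqueueThunk};
}
```

Every call the HAL makes through the ops table still lands on the original window object, which is why the null check on preview_stream_ops in CameraHal is effectively a null check on the surface.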
From here, hardware calls into the HAL module, which in turn calls into the HAL layer:
    int camera_set_preview_window(struct camera_device * device,
            struct preview_stream_ops *window)
    {
        int rv = -EINVAL;
        ti_camera_device_t* ti_dev = NULL;

        LOGV("%s", __FUNCTION__);

        if(!device)
            return rv;

        ti_dev = (ti_camera_device_t*) device;

        rv = gCameraHals[ti_dev->cameraid]->setPreviewWindow(window);

        return rv;
    }
The HAL layer then handles the call:
    status_t CameraHal::setPreviewWindow(struct preview_stream_ops *window)
    {
        status_t ret = NO_ERROR;
        CameraAdapter::BuffersDescriptor desc;

        LOG_FUNCTION_NAME;
        mSetPreviewWindowCalled = true;

        //If the Camera service passes a null window, we destroy existing window and free the DisplayAdapter
        if(!window)
        {
            if(mDisplayAdapter.get() != NULL)
            {
                ///NULL window passed, destroy the display adapter if present
                CAMHAL_LOGD("NULL window passed, destroying display adapter");
                mDisplayAdapter.clear();
                ///@remarks If there was a window previously existing, we usually expect another valid window to be passed by the client
                ///@remarks so, we will wait until it passes a valid window to begin the preview again
                mSetPreviewWindowCalled = false;
            }
            CAMHAL_LOGD("NULL ANativeWindow passed to setPreviewWindow");
            return NO_ERROR;
        }else if(mDisplayAdapter.get() == NULL)
        {
            // Need to create the display adapter since it has not been created
            // Create display adapter
            mDisplayAdapter = new ANativeWindowDisplayAdapter();
            ret = NO_ERROR;
            if(!mDisplayAdapter.get() || ((ret=mDisplayAdapter->initialize())!=NO_ERROR))
            {
                if(ret!=NO_ERROR)
                {
                    mDisplayAdapter.clear();
                    CAMHAL_LOGEA("DisplayAdapter initialize failed");
                    LOG_FUNCTION_NAME_EXIT;
                    return ret;
                }
                else
                {
                    CAMHAL_LOGEA("Couldn't create DisplayAdapter");
                    LOG_FUNCTION_NAME_EXIT;
                    return NO_MEMORY;
                }
            }

            // DisplayAdapter needs to know where to get the CameraFrames from inorder to display
            // Since CameraAdapter is the one that provides the frames, set it as the frame provider for DisplayAdapter
            mDisplayAdapter->setFrameProvider(mCameraAdapter);

            // Any dynamic errors that happen during the camera use case has to be propagated back to the application
            // via CAMERA_MSG_ERROR. AppCallbackNotifier is the class that notifies such errors to the application
            // Set it as the error handler for the DisplayAdapter
            mDisplayAdapter->setErrorHandler(mAppCallbackNotifier.get());

            // Update the display adapter with the new window that is passed from CameraService
            ret = mDisplayAdapter->setPreviewWindow(window);
            if(ret!=NO_ERROR)
            {
                CAMHAL_LOGEB("DisplayAdapter setPreviewWindow returned error %d", ret);
            }

            if(mPreviewStartInProgress)
            {
                CAMHAL_LOGDA("setPreviewWindow called when preview running");
                // Start the preview since the window is now available
                ret = startPreview();
            }
        } else {
            // Update the display adapter with the new window that is passed from CameraService
            ret = mDisplayAdapter->setPreviewWindow(window);
            if ( (NO_ERROR == ret) && previewEnabled() ) {
                restartPreview();
            } else if (ret == ALREADY_EXISTS) {
                // ALREADY_EXISTS should be treated as a noop in this case
                ret = NO_ERROR;
            }
        }
        LOG_FUNCTION_NAME_EXIT;

        return ret;

    }
This sets up where the display data comes from, where it goes, and the error callback, and finally the preview is started:

    status_t CameraHal::startPreview() {
        LOG_FUNCTION_NAME;

        // When tunneling is enabled during VTC, startPreview happens in 2 steps:
        // When the application sends the command CAMERA_CMD_PREVIEW_INITIALIZATION,
        // cameraPreviewInitialization() is called, which in turn causes the CameraAdapter
        // to move from loaded to idle state. And when the application calls startPreview,
        // the CameraAdapter moves from idle to executing state.
        //
        // If the application calls startPreview() without sending the command
        // CAMERA_CMD_PREVIEW_INITIALIZATION, then the function cameraPreviewInitialization()
        // AND startPreview() are executed. In other words, if the application calls
        // startPreview() without sending the command CAMERA_CMD_PREVIEW_INITIALIZATION,
        // then the CameraAdapter moves from loaded to idle to executing state in one shot.
        status_t ret = cameraPreviewInitialization(); // this call is critical; it is analyzed in detail below

        // The flag mPreviewInitializationDone is set to true at the end of the function
        // cameraPreviewInitialization(). Therefore, if everything goes alright, then the
        // flag will be set. Sometimes, the function cameraPreviewInitialization() may
        // return prematurely if all the resources are not available for starting preview.
        // For example, if the preview window is not set, then it would return NO_ERROR.
        // Under such circumstances, one should return from startPreview as well and should
        // not continue execution. That is why, we check the flag and not the return value.
        if (!mPreviewInitializationDone) return ret;

        // Once startPreview is called, there is no need to continue to remember whether
        // the function cameraPreviewInitialization() was called earlier or not. And so
        // the flag mPreviewInitializationDone is reset here. Plus, this preserves the
        // current behavior of startPreview under the circumstances where the application
        // calls startPreview twice or more.
        mPreviewInitializationDone = false;

        //Enable the display adapter if present, actual overlay enable happens when we post the buffer
        // (this "overlay enable" is exactly the spot I had been looking for; it will be discussed in detail later)
        if(mDisplayAdapter.get() != NULL) {
            CAMHAL_LOGDA("Enabling display");
            int width, height;
            mParameters.getPreviewSize(&width, &height);

    #if PPM_INSTRUMENTATION || PPM_INSTRUMENTATION_ABS
            ret = mDisplayAdapter->enableDisplay(width, height, &mStartPreview);
    #else
            ret = mDisplayAdapter->enableDisplay(width, height, NULL);
    #endif

            if ( ret != NO_ERROR ) {
                CAMHAL_LOGEA("Couldn't enable display");

                // FIXME: At this stage mStateSwitchLock is locked and unlock is supposed to be called
                // only from mCameraAdapter->sendCommand(CameraAdapter::CAMERA_START_PREVIEW)
                // below. But this will never happen because of goto error. Thus at next
                // startPreview() call CameraHAL will be deadlocked.
                // Need to revisit mStateSwitch lock, for now just abort the process.
                CAMHAL_ASSERT_X(false,
                    "At this stage mCameraAdapter->mStateSwitchLock is still locked, "
                    "deadlock is guaranteed");

                goto error;
            }

        }

        CAMHAL_LOGDA("Starting CameraAdapter preview mode");
        //Send START_PREVIEW command to adapter
        ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_START_PREVIEW); // from here we call into BaseCameraAdapter

        if(ret!=NO_ERROR) {
            CAMHAL_LOGEA("Couldn't start preview w/ CameraAdapter");
            goto error;
        }
        CAMHAL_LOGDA("Started preview");

        mPreviewEnabled = true;
        mPreviewStartInProgress = false;
        return ret;

        error:

            CAMHAL_LOGEA("Performing cleanup after error");

            //Do all the cleanup
            freePreviewBufs();
            mCameraAdapter->sendCommand(CameraAdapter::CAMERA_STOP_PREVIEW);
            if(mDisplayAdapter.get() != NULL) {
                mDisplayAdapter->disableDisplay(false);
            }
            mAppCallbackNotifier->stop();
            mPreviewStartInProgress = false;
            mPreviewEnabled = false;
            LOG_FUNCTION_NAME_EXIT;

            return ret;
    }
BaseCameraAdapter implements the parent class's sendCommand method:

    case CameraAdapter::CAMERA_START_PREVIEW:
        {

        CAMHAL_LOGDA("Start Preview");

        if ( ret == NO_ERROR )
            {
            ret = setState(operation);
            }

        if ( ret == NO_ERROR )
            {
            ret = startPreview();
            }

        if ( ret == NO_ERROR )
            {
            ret = commitState();
            }
        else
            {
            ret |= rollbackState();
            }

        break;

        }
Next let's follow startPreview. As analyzed in an earlier article, the startPreview called here is not BaseCameraAdapter's but V4LCameraAdapter's:

    status_t V4LCameraAdapter::startPreview()
    {
        status_t ret = NO_ERROR;

        LOG_FUNCTION_NAME;
        Mutex::Autolock lock(mPreviewBufsLock);

        if(mPreviewing) {
            ret = BAD_VALUE;
            goto EXIT;
        }

        for (int i = 0; i < mPreviewBufferCountQueueable; i++) {

            mVideoInfo->buf.index = i;
            mVideoInfo->buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            mVideoInfo->buf.memory = V4L2_MEMORY_MMAP;

            ret = v4lIoctl(mCameraHandle, VIDIOC_QBUF, &mVideoInfo->buf); // queue the buffer to the driver
            if (ret < 0) {
                CAMHAL_LOGEA("VIDIOC_QBUF Failed");
                goto EXIT;
            }
            nQueued++;
        }

        ret = v4lStartStreaming();

        // Create and start preview thread for receiving buffers from V4L Camera
        if(!mCapturing) {
            mPreviewThread = new PreviewThread(this); // start the preview thread
            CAMHAL_LOGDA("Created preview thread");
        }

        //Update the flag to indicate we are previewing
        mPreviewing = true;
        mCapturing = false;

    EXIT:
        LOG_FUNCTION_NAME_EXIT;
        return ret;
    }

    status_t V4LCameraAdapter::v4lStartStreaming () {
        status_t ret = NO_ERROR;
        enum v4l2_buf_type bufType;

        if (!mVideoInfo->isStreaming) {
            bufType = V4L2_BUF_TYPE_VIDEO_CAPTURE;

            ret = v4lIoctl (mCameraHandle, VIDIOC_STREAMON, &bufType); // start streaming: the preview actually begins here
            if (ret < 0) {
                CAMHAL_LOGEB("StartStreaming: Unable to start capture: %s", strerror(errno));
                return ret;
            }
            mVideoInfo->isStreaming = true;
        }
        return ret;
    }
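The control flow of the two functions above can be modeled without a real /dev/video* device. The ToyAdapter below is my own illustrative stand-in, not the real V4LCameraAdapter: the V4L2 ioctls (VIDIOC_QBUF, VIDIOC_STREAMON) are replaced by counters and flags, so only the bookkeeping is shown, including the guard that rejects a second startPreview.

```cpp
// Toy model of V4LCameraAdapter's preview bookkeeping. The ioctls are
// replaced by counters/flags so the control flow can be exercised anywhere.
struct ToyAdapter {
    int nQueued = 0;
    bool isStreaming = false;
    bool previewing = false;

    // Mirrors startPreview: queue every buffer (VIDIOC_QBUF), then start
    // streaming (VIDIOC_STREAMON) exactly once; a second call is rejected.
    int startPreview(int bufferCount) {
        if (previewing) return -1;             // BAD_VALUE in the real code
        for (int i = 0; i < bufferCount; ++i)
            ++nQueued;                         // stands in for VIDIOC_QBUF
        if (!isStreaming) isStreaming = true;  // stands in for VIDIOC_STREAMON
        previewing = true;
        return 0;                              // NO_ERROR
    }
};
```

The important property is that all buffers are queued to the driver before streaming is turned on; the driver fills them in turn, and the preview thread dequeues them afterwards.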
Now let's see what the newly started preview thread actually does:

    int V4LCameraAdapter::previewThread()
    {
        status_t ret = NO_ERROR;
        int width, height;
        CameraFrame frame;
        void *y_uv[2];
        int index = 0;
        int stride = 4096;
        char *fp = NULL;

        mParams.getPreviewSize(&width, &height);

        if (mPreviewing) {

            fp = this->GetFrame(index);
            if(!fp) {
                ret = BAD_VALUE;
                goto EXIT;
            }
            CameraBuffer *buffer = mPreviewBufs.keyAt(index);
            CameraFrame *lframe = (CameraFrame *)mFrameQueue.valueFor(buffer);
            if (!lframe) {
                ret = BAD_VALUE;
                goto EXIT;
            }

            debugShowFPS();

            if ( mFrameSubscribers.size() == 0 ) {
                ret = BAD_VALUE;
                goto EXIT;
            }
            // from here on, as I understand it, the frame data is converted and saved
            y_uv[0] = (void*) lframe->mYuv[0];
            //y_uv[1] = (void*) lframe->mYuv[1];
            //y_uv[1] = (void*) (lframe->mYuv[0] + height*stride);
            convertYUV422ToNV12Tiler ( (unsigned char*)fp, (unsigned char*)y_uv[0], width, height);
            CAMHAL_LOGVB("##...index= %d.;camera buffer= 0x%x; y= 0x%x; UV= 0x%x.",index, buffer, y_uv[0], y_uv[1] );

    #ifdef SAVE_RAW_FRAMES
            unsigned char* nv12_buff = (unsigned char*) malloc(width*height*3/2);
            //Convert yuv422i to yuv420sp(NV12) & dump the frame to a file
            convertYUV422ToNV12 ( (unsigned char*)fp, nv12_buff, width, height);
            saveFile( nv12_buff, ((width*height)*3/2) );
            free (nv12_buff);
    #endif

            frame.mFrameType = CameraFrame::PREVIEW_FRAME_SYNC;
            frame.mBuffer = buffer;
            frame.mLength = width*height*3/2;
            frame.mAlignment = stride;
            frame.mOffset = 0;
            frame.mTimestamp = systemTime(SYSTEM_TIME_MONOTONIC);
            frame.mFrameMask = (unsigned int)CameraFrame::PREVIEW_FRAME_SYNC;

            if (mRecording)
            {
                frame.mFrameMask |= (unsigned int)CameraFrame::VIDEO_FRAME_SYNC;
                mFramesWithEncoder++;
            }

            ret = setInitFrameRefCount(frame.mBuffer, frame.mFrameMask);
            if (ret != NO_ERROR) {
                CAMHAL_LOGDB("Error in setInitFrameRefCount %d", ret);
            } else {
                ret = sendFrameToSubscribers(&frame);
            }
        }
    EXIT:

        return ret;
    }
A few remarks on this code. To me this is the relay station of the whole data-return path: the buffer obtained in the highlighted part holds the video data returned by the underlying driver.
What I don't yet understand is how the driver's video data becomes associated with mPreviewBufs and index, such that buffer = mPreviewBufs.keyAt(index) yields the right CameraBuffer; this will be explored in detail shortly.
Moving on: once the video data is obtained it is, if needed, converted and saved to a file for later use.
Finally, the CameraBuffer obtained above is used to fill in a CameraFrame. This structure is crucial: as I understand it, the data ultimately flows back through sendFrameToSubscribers(&frame).
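The convertYUV422ToNV12 step above repacks the driver's packed YUYV (4:2:2) frames into the NV12 (4:2:0) layout expected downstream, which is also why frame.mLength is width*height*3/2. A minimal sketch of such a conversion (my own illustrative version, not TI's implementation; chroma is simply taken from the even rows):

```cpp
#include <cstdint>
#include <vector>

// Convert packed YUYV (Y0 U Y1 V per pair of pixels) to NV12:
// a full-resolution Y plane followed by an interleaved half-resolution UV plane.
// Output size is width*height*3/2 bytes; width and height must be even.
std::vector<uint8_t> yuyvToNV12(const std::vector<uint8_t>& src, int width, int height) {
    std::vector<uint8_t> dst(width * height * 3 / 2);
    uint8_t* y  = dst.data();
    uint8_t* uv = dst.data() + width * height;
    for (int row = 0; row < height; ++row) {
        for (int col = 0; col < width; col += 2) {
            const uint8_t* p = &src[(row * width + col) * 2];  // 2 bytes per pixel
            y[row * width + col]     = p[0];                   // Y0
            y[row * width + col + 1] = p[2];                   // Y1
            if (row % 2 == 0) {                                // subsample chroma vertically
                uv[(row / 2) * width + col]     = p[1];        // U
                uv[(row / 2) * width + col + 1] = p[3];        // V
            }
        }
    }
    return dst;
}
```

Note how the output size matches frame.mLength = width*height*3/2 in the preview thread exactly: one byte of Y per pixel plus half a byte of interleaved UV.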

Let's now trace how the driver's video data becomes associated with mPreviewBufs and index.
This brings us back to a very important method already mentioned above.
It is the first step of startPreview: cameraPreviewInitialization.
    status_t CameraHal::cameraPreviewInitialization()
    {

        status_t ret = NO_ERROR;
        CameraAdapter::BuffersDescriptor desc;
        CameraFrame frame;
        unsigned int required_buffer_count;
        unsigned int max_queueble_buffers;

    #if PPM_INSTRUMENTATION || PPM_INSTRUMENTATION_ABS
        gettimeofday(&mStartPreview, NULL);
    #endif

        LOG_FUNCTION_NAME;

        if (mPreviewInitializationDone) {
            return NO_ERROR;
        }

        if ( mPreviewEnabled ){
          CAMHAL_LOGDA("Preview already running");
          LOG_FUNCTION_NAME_EXIT;
          return ALREADY_EXISTS;
        }

        if ( NULL != mCameraAdapter ) {
          ret = mCameraAdapter->setParameters(mParameters); // push the parameters into the CameraAdapter
        }

        if ((mPreviewStartInProgress == false) && (mDisplayPaused == false)){
          ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_QUERY_RESOLUTION_PREVIEW,( int ) &frame); // obtain the frame via this command
          if ( NO_ERROR != ret ){
            CAMHAL_LOGEB("Error: CAMERA_QUERY_RESOLUTION_PREVIEW %d", ret);
            return ret;
          }

          ///Update the current preview width and height
          mPreviewWidth = frame.mWidth; // initialize the width and height
          mPreviewHeight = frame.mHeight;
        }

        ///If we don't have the preview callback enabled and display adapter,
        if(!mSetPreviewWindowCalled || (mDisplayAdapter.get() == NULL)){
          CAMHAL_LOGD("Preview not started. Preview in progress flag set");
          mPreviewStartInProgress = true;
          ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_SWITCH_TO_EXECUTING);
          if ( NO_ERROR != ret ){
            CAMHAL_LOGEB("Error: CAMERA_SWITCH_TO_EXECUTING %d", ret);
            return ret;
          }
          return NO_ERROR;
        }

        if( (mDisplayAdapter.get() != NULL) && ( !mPreviewEnabled ) && ( mDisplayPaused ) )
            {
            CAMHAL_LOGDA("Preview is in paused state");

            mDisplayPaused = false;
            mPreviewEnabled = true;
            if ( NO_ERROR == ret )
                {
                ret = mDisplayAdapter->pauseDisplay(mDisplayPaused);

                if ( NO_ERROR != ret )
                    {
                    CAMHAL_LOGEB("Display adapter resume failed %x", ret);
                    }
                }
            //restart preview callbacks
            if(mMsgEnabled & CAMERA_MSG_PREVIEW_FRAME)
            {
                mAppCallbackNotifier->enableMsgType (CAMERA_MSG_PREVIEW_FRAME);
            }

            signalEndImageCapture();
            return ret;
            }

        required_buffer_count = atoi(mCameraProperties->get(CameraProperties::REQUIRED_PREVIEW_BUFS));

        ///Allocate the preview buffers
        ret = allocPreviewBufs(mPreviewWidth, mPreviewHeight, mParameters.getPreviewFormat(), required_buffer_count, max_queueble_buffers);

        if ( NO_ERROR != ret )
            {
            CAMHAL_LOGEA("Couldn't allocate buffers for Preview");
            goto error;
            }

        if ( mMeasurementEnabled )
            {

            ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_QUERY_BUFFER_SIZE_PREVIEW_DATA,
                                              ( int ) &frame,
                                              required_buffer_count);
            if ( NO_ERROR != ret )
                {
                return ret;
                }

             ///Allocate the preview data buffers
            ret = allocPreviewDataBufs(frame.mLength, required_buffer_count);
            if ( NO_ERROR != ret ) {
                CAMHAL_LOGEA("Couldn't allocate preview data buffers");
                goto error;
               }

            if ( NO_ERROR == ret )
                {
                desc.mBuffers = mPreviewDataBuffers;
                desc.mOffsets = mPreviewDataOffsets;
                desc.mFd = mPreviewDataFd;
                desc.mLength = mPreviewDataLength;
                desc.mCount = ( size_t ) required_buffer_count;
                desc.mMaxQueueable = (size_t) required_buffer_count;

                mCameraAdapter->sendCommand(CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW_DATA,
                                            ( int ) &desc);
                }

            }

        ///Pass the buffers to Camera Adapter
        desc.mBuffers = mPreviewBuffers;
        desc.mOffsets = mPreviewOffsets;
        desc.mFd = mPreviewFd;
        desc.mLength = mPreviewLength;
        desc.mCount = ( size_t ) required_buffer_count;
        desc.mMaxQueueable = (size_t) max_queueble_buffers;

        ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW,( int ) &desc);

        if ( NO_ERROR != ret )
            {
            CAMHAL_LOGEB("Failed to register preview buffers: 0x%x", ret);
            freePreviewBufs();
            return ret;
            }

        mAppCallbackNotifier->startPreviewCallbacks(mParameters, mPreviewBuffers, mPreviewOffsets, mPreviewFd, mPreviewLength, required_buffer_count);
        ///Start the callback notifier
        ret = mAppCallbackNotifier->start();

        if( ALREADY_EXISTS == ret )
            {
            //Already running, do nothing
            CAMHAL_LOGDA("AppCallbackNotifier already running");
            ret = NO_ERROR;
            }
        else if ( NO_ERROR == ret ) {
            CAMHAL_LOGDA("Started AppCallbackNotifier..");
            mAppCallbackNotifier->setMeasurements(mMeasurementEnabled);
            }
        else
            {
            CAMHAL_LOGDA("Couldn't start AppCallbackNotifier");
            goto error;
            }

        if (ret == NO_ERROR) mPreviewInitializationDone = true;
        return ret;

        error:

            CAMHAL_LOGEA("Performing cleanup after error");

            //Do all the cleanup
            freePreviewBufs();
            mCameraAdapter->sendCommand(CameraAdapter::CAMERA_STOP_PREVIEW);
            if(mDisplayAdapter.get() != NULL)
                {
                mDisplayAdapter->disableDisplay(false);
                }
            mAppCallbackNotifier->stop();
            mPreviewStartInProgress = false;
            mPreviewEnabled = false;
            LOG_FUNCTION_NAME_EXIT;

            return ret;
    }
Memory is first allocated here for the preview buffers, which are then handed to the camera adapter via mCameraAdapter->sendCommand(CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW, (int) &desc).
Inside sendCommand this command is handled as follows:

  1.             case CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW:
  2.                 CAMHAL_LOGDA("Use buffers for preview");
  3.                 desc = ( BuffersDescriptor * ) value1;

  4.                 if ( NULL == desc )
  5.                     {
  6.                     CAMHAL_LOGEA("Invalid preview buffers!");
  7.                     return -EINVAL;
  8.                     }

  9.                 if ( ret == NO_ERROR )
  10.                     {
  11.                     ret = setState(operation);
  12.                     }

  13.                 if ( ret == NO_ERROR )
  14.                     {
  15.                     Mutex::Autolock lock(mPreviewBufferLock);
  16.                     mPreviewBuffers = desc->mBuffers;
  17.                     mPreviewBuffersLength = desc->mLength;
  18.                     mPreviewBuffersAvailable.clear();
  19.                     mSnapshotBuffersAvailable.clear();
  20.                     for ( uint32_t i = 0 ; i < desc->mMaxQueueable ; i++ )
  21.                         {
  22.                         mPreviewBuffersAvailable.add(&mPreviewBuffers[i], 0); // this associates mPreviewBuffersAvailable with mPreviewBuffers
  23.                         }
  24.                     // initial ref count for undeqeueued buffers is 1 since buffer provider
  25.                     // is still holding on to it
  26.                     for ( uint32_t i = desc->mMaxQueueable ; i < desc->mCount ; i++ )
  27.                         {
  28.                         mPreviewBuffersAvailable.add(&mPreviewBuffers[i], 1);
  29.                         }
  30.                     }

  31.                 if ( NULL != desc )
  32.                     {
  33.                     ret = useBuffers(CameraAdapter::CAMERA_PREVIEW,
  34.                                      desc->mBuffers,
  35.                                      desc->mCount,
  36.                                      desc->mLength,
  37.                                      desc->mMaxQueueable);
  38.                     }

  39.                 if ( ret == NO_ERROR )
  40.                     {
  41.                     ret = commitState();
  42.                     }
  43.                 else
  44.                     {
  45.                     ret |= rollbackState();
  46.                     }

  47.                 break;
This ends up in V4LCameraAdapter's useBuffers() method, which in turn calls UseBuffersPreview():

  1. status_t V4LCameraAdapter::UseBuffersPreview(CameraBuffer *bufArr, int num)
  2. {
  3.     int ret = NO_ERROR;
  4.     LOG_FUNCTION_NAME;

  5.     if(NULL == bufArr) {
  6.         ret = BAD_VALUE;
  7.         goto EXIT;
  8.     }

  9.     ret = v4lInitMmap(num);
  10.     if (ret == NO_ERROR) {
  11.         for (int i = 0; i < num; i++) {
  12.             //Associate each Camera internal buffer with the one from Overlay
  13.             mPreviewBufs.add(&bufArr[i], i); // this associates mPreviewBufs with desc->mBuffers
  14.             CAMHAL_LOGDB("Preview- buff [%d] = 0x%x ",i, mPreviewBufs.keyAt(i));
  15.         }

  16.         // Update the preview buffer count
  17.         mPreviewBufferCount = num;
  18.     }
  19. EXIT:
  20.     LOG_FUNCTION_NAME_EXIT;
  21.     return ret;
  22. }
At this point it is worth digging into how the mAppCallbackNotifier object is initialized, because it sets up many of the callbacks.
Where does that initialization happen? In CameraHal::initialize():

  1. /**
  2.    @brief Initialize the Camera HAL

  3.    Creates CameraAdapter, AppCallbackNotifier, DisplayAdapter and MemoryManager

  4.    @param None
  5.    @return NO_ERROR - On success
  6.          NO_MEMORY - On failure to allocate memory for any of the objects
  7.    @remarks Camera Hal internal function

  8.  */

  9. status_t CameraHal::initialize(CameraProperties::Properties* properties)
  10. {
  11.     LOG_FUNCTION_NAME;

  12.     int sensor_index = 0;
  13.     const char* sensor_name = NULL;

  14.     ///Initialize the event mask used for registering an event provider for AppCallbackNotifier
  15.     ///Currently, registering all events as to be coming from CameraAdapter
  16.     int32_t eventMask = CameraHalEvent::ALL_EVENTS;

  17.     // Get my camera properties
  18.     mCameraProperties = properties;

  19.     if(!mCameraProperties)
  20.     {
  21.         goto fail_loop;
  22.     }

  23.     // Dump the properties of this Camera
  24.     // will only print if DEBUG macro is defined
  25.     mCameraProperties->dump();

  26.     if (strcmp(CameraProperties::DEFAULT_VALUE, mCameraProperties->get(CameraProperties::CAMERA_SENSOR_INDEX)) != 0 )
  27.         {
  28.         sensor_index = atoi(mCameraProperties->get(CameraProperties::CAMERA_SENSOR_INDEX));
  29.         }

  30.     if (strcmp(CameraProperties::DEFAULT_VALUE, mCameraProperties->get(CameraProperties::CAMERA_NAME)) != 0 ) {
  31.         sensor_name = mCameraProperties->get(CameraProperties::CAMERA_NAME);
  32.     }
  33.     CAMHAL_LOGDB("Sensor index= %d; Sensor name= %s", sensor_index, sensor_name);

  34.     if (strcmp(sensor_name, V4L_CAMERA_NAME_USB) == 0) {
  35. #ifdef V4L_CAMERA_ADAPTER
  36.         mCameraAdapter = V4LCameraAdapter_Factory(sensor_index);
  37. #endif
  38.     }
  39.     else {
  40. #ifdef OMX_CAMERA_ADAPTER
  41.         mCameraAdapter = OMXCameraAdapter_Factory(sensor_index);
  42. #endif
  43.     }

  44.     if ( ( NULL == mCameraAdapter ) || (mCameraAdapter->initialize(properties)!=NO_ERROR))
  45.         {
  46.         CAMHAL_LOGEA("Unable to create or initialize CameraAdapter");
  47.         mCameraAdapter = NULL;
  48.         goto fail_loop;
  49.         }

  50.     mCameraAdapter->incStrong(mCameraAdapter);
  51.     mCameraAdapter->registerImageReleaseCallback(releaseImageBuffers, (void *) this);
  52.     mCameraAdapter->registerEndCaptureCallback(endImageCapture, (void *)this);

  53.     if(!mAppCallbackNotifier.get())
  54.         {
  55.         /// Create the callback notifier
  56.         mAppCallbackNotifier = new AppCallbackNotifier();
  57.         if( ( NULL == mAppCallbackNotifier.get() ) || ( mAppCallbackNotifier->initialize() != NO_ERROR))
  58.             {
  59.             CAMHAL_LOGEA("Unable to create or initialize AppCallbackNotifier");
  60.             goto fail_loop;
  61.             }
  62.         }

  63.     if(!mMemoryManager.get())
  64.         {
  65.         /// Create Memory Manager
  66.         mMemoryManager = new MemoryManager();
  67.         if( ( NULL == mMemoryManager.get() ) || ( mMemoryManager->initialize() != NO_ERROR))
  68.             {
  69.             CAMHAL_LOGEA("Unable to create or initialize MemoryManager");
  70.             goto fail_loop;
  71.             }
  72.         }

  73.     ///Setup the class dependencies...

  74.     ///AppCallbackNotifier has to know where to get the Camera frames and the events like auto focus lock etc from.
  75.     ///CameraAdapter is the one which provides those events
  76.     ///Set it as the frame and event providers for AppCallbackNotifier
  77.     ///@remarks setEventProvider API takes in a bit mask of events for registering a provider for the different events
  78.     /// That way, if events can come from DisplayAdapter in future, we will be able to add it as provider
  79.     /// for any event
  80.     mAppCallbackNotifier->setEventProvider(eventMask, mCameraAdapter);
  81.     mAppCallbackNotifier->setFrameProvider(mCameraAdapter);

  82.     ///Any dynamic errors that happen during the camera use case has to be propagated back to the application
  83.     ///via CAMERA_MSG_ERROR. AppCallbackNotifier is the class that notifies such errors to the application
  84.     ///Set it as the error handler for CameraAdapter
  85.     mCameraAdapter->setErrorHandler(mAppCallbackNotifier.get());

  86.     ///Start the callback notifier
  87.     if(mAppCallbackNotifier->start() != NO_ERROR)
  88.       {
  89.         CAMHAL_LOGEA("Couldn't start AppCallbackNotifier");
  90.         goto fail_loop;
  91.       }

  92.     CAMHAL_LOGDA("Started AppCallbackNotifier..");
  93.     mAppCallbackNotifier->setMeasurements(mMeasurementEnabled);

  94.     ///Initialize default parameters
  95.     initDefaultParameters();


  96.     if ( setParameters(mParameters) != NO_ERROR )
  97.         {
  98.         CAMHAL_LOGEA("Failed to set default parameters?!");
  99.         }

  100.     // register for sensor events
  101.     mSensorListener = new SensorListener();
  102.     if (mSensorListener.get()) {
  103.         if (mSensorListener->initialize() == NO_ERROR) {
  104.             mSensorListener->setCallbacks(orientation_cb, this);
  105.             mSensorListener->enableSensor(SensorListener::SENSOR_ORIENTATION);
  106.         } else {
  107.             CAMHAL_LOGEA("Error initializing SensorListener. not fatal, continuing");
  108.             mSensorListener.clear();
  109.             mSensorListener = NULL;
  110.         }
  111.     }

  112.     LOG_FUNCTION_NAME_EXIT;

  113.     return NO_ERROR;

  114.     fail_loop:

  115.         ///Free up the resources because we failed somewhere up
  116.         deinitialize();
  117.         LOG_FUNCTION_NAME_EXIT;

  118.         return NO_MEMORY;

  119. }
Several objects are instantiated here; the one we really care about is mAppCallbackNotifier, which is created, initialize()d, and then given its event and frame providers via setEventProvider and setFrameProvider.
Let's look at what setFrameProvider does:

  1. void AppCallbackNotifier::setFrameProvider(FrameNotifier *frameNotifier)
  2. {
  3.     LOG_FUNCTION_NAME;
  4.     ///@remarks There is no NULL check here. We will check
  5.     ///for NULL when we get the start command from CameraAdapter
  6.     mFrameProvider = new FrameProvider(frameNotifier, this, frameCallbackRelay);
  7.     if ( NULL == mFrameProvider )
  8.         {
  9.         CAMHAL_LOGEA("Error in creating FrameProvider");
  10.         }
  11.     else
  12.         {
  13.         //Register only for captured images and RAW for now
  14.         //TODO: Register for and handle all types of frames
  15.         mFrameProvider->enableFrameNotification(CameraFrame::IMAGE_FRAME);
  16.         mFrameProvider->enableFrameNotification(CameraFrame::RAW_FRAME);
  17.         }

  18.     LOG_FUNCTION_NAME_EXIT;
  19. }
This instantiates a FrameProvider and enables the corresponding frame notifications. One of the constructor arguments, frameCallbackRelay, is a callback function. Before following it, let's step back to the previewThread method sendFrameToSubscribers.
That method simply delegates to the following implementation:
  1. status_t BaseCameraAdapter::__sendFrameToSubscribers(CameraFrame* frame,
  2.                                                      KeyedVector<int, frame_callback> *subscribers,
  3.                                                      CameraFrame::FrameType frameType)
  4. {
  5.     size_t refCount = 0;
  6.     status_t ret = NO_ERROR;
  7.     frame_callback callback = NULL;

  8.     frame->mFrameType = frameType;

  9.     if ( (frameType == CameraFrame::PREVIEW_FRAME_SYNC) ||
  10.          (frameType == CameraFrame::VIDEO_FRAME_SYNC) ||
  11.          (frameType == CameraFrame::SNAPSHOT_FRAME) ){
  12.         if (mFrameQueue.size() > 0){
  13.           CameraFrame *lframe = (CameraFrame *)mFrameQueue.valueFor(frame->mBuffer);
  14.           frame->mYuv[0] = lframe->mYuv[0];
  15.           frame->mYuv[1] = frame->mYuv[0] + (frame->mLength + frame->mOffset)*2/3;
  16.         }
  17.         else{
  18.           CAMHAL_LOGDA("Empty Frame Queue");
  19.           return -EINVAL;
  20.         }
  21.       }

  22.     if (NULL != subscribers) {
  23.         refCount = getFrameRefCount(frame->mBuffer, frameType);

  24.         if (refCount == 0) {
  25.             CAMHAL_LOGDA("Invalid ref count of 0");
  26.             return -EINVAL;
  27.         }

  28.         if (refCount > subscribers->size()) {
  29.             CAMHAL_LOGEB("Invalid ref count for frame type: 0x%x", frameType);
  30.             return -EINVAL;
  31.         }

  32.         CAMHAL_LOGVB("Type of Frame: 0x%x address: 0x%x refCount start %d",
  33.                      frame->mFrameType,
  34.                      ( uint32_t ) frame->mBuffer,
  35.                      refCount);

  36.         for ( unsigned int i = 0 ; i < refCount; i++ ) {
  37.             frame->mCookie = ( void * ) subscribers->keyAt(i);
  38.             callback = (frame_callback) subscribers->valueAt(i);

  39.             if (!callback) {
  40.                 CAMHAL_LOGEB("callback not set for frame type: 0x%x", frameType);
  41.                 return -EINVAL;
  42.             }

  43.             callback(frame);
  44.         }
  45.     } else {
  46.         CAMHAL_LOGEA("Subscribers is null??");
  47.         return -EINVAL;
  48.     }

  49.     return ret;
  50. }
The most important part is shown above: the subscribers KeyedVector yields, for each subscriber, the matching frame->mCookie and callback method.
The callback obtained here is the frameCallbackRelay function passed in by setFrameProvider; let's look at its concrete implementation:

  1. void AppCallbackNotifier::frameCallbackRelay(CameraFrame* caFrame)
  2. {
  3.     LOG_FUNCTION_NAME;
  4.     AppCallbackNotifier *appcbn = (AppCallbackNotifier*) (caFrame->mCookie);
  5.     appcbn->frameCallback(caFrame);
  6.     LOG_FUNCTION_NAME_EXIT;
  7. }

  8. void AppCallbackNotifier::frameCallback(CameraFrame* caFrame)
  9. {
  10.     ///Post the event to the event queue of AppCallbackNotifier
  11.     TIUTILS::Message msg;
  12.     CameraFrame *frame;

  13.     LOG_FUNCTION_NAME;

  14.     if ( NULL != caFrame )
  15.         {

  16.         frame = new CameraFrame(*caFrame);
  17.         if ( NULL != frame )
  18.             {
  19.               msg.command = AppCallbackNotifier::NOTIFIER_CMD_PROCESS_FRAME;
  20.               msg.arg1 = frame;
  21.               mFrameQ.put(&msg);
  22.             }
  23.         else
  24.             {
  25.             CAMHAL_LOGEA("Not enough resources to allocate CameraFrame");
  26.             }

  27.         }

  28.     LOG_FUNCTION_NAME_EXIT;
  29. }
This callback simply packs the frame into the msg message structure and puts that message onto mFrameQ, the global message queue from which the app side eventually retrieves the data.
Recall that when AppCallbackNotifier was created, initialize() was called to perform the initial setup:

  1. /**
  2.   * NotificationHandler class
  3.   */

  4. ///Initialization function for AppCallbackNotifier
  5. status_t AppCallbackNotifier::initialize()
  6. {
  7.     LOG_FUNCTION_NAME;

  8.     mPreviewMemory = 0;

  9.     mMeasurementEnabled = false;

  10.     mNotifierState = NOTIFIER_STOPPED;

  11.     ///Create the app notifier thread
  12.     mNotificationThread = new NotificationThread(this);
  13.     if(!mNotificationThread.get())
  14.         {
  15.         CAMHAL_LOGEA("Couldn't create Notification thread");
  16.         return NO_MEMORY;
  17.         }

  18.     ///Start the display thread
  19.     status_t ret = mNotificationThread->run("NotificationThread", PRIORITY_URGENT_DISPLAY);
  20.     if(ret!=NO_ERROR)
  21.         {
  22.         CAMHAL_LOGEA("Couldn't run NotificationThread");
  23.         mNotificationThread.clear();
  24.         return ret;
  25.         }

  26.     mUseMetaDataBufferMode = true;
  27.     mRawAvailable = false;

  28.     mRecording = false;
  29.     mPreviewing = false;

  30.     LOG_FUNCTION_NAME_EXIT;

  31.     return ret;
  32. }
The most important thing this initialization does is start a thread that waits for messages coming from the HAL layer and relays the messages or data to the app. Here is what that thread does:

  1. bool AppCallbackNotifier::notificationThread()
  2. {
  3.     bool shouldLive = true;
  4.     status_t ret;

  5.     LOG_FUNCTION_NAME;

  6.     //CAMHAL_LOGDA("Notification Thread waiting for message");
  7.     ret = TIUTILS::MessageQueue::waitForMsg(&mNotificationThread->msgQ(),
  8.                                             &mEventQ,
  9.                                             &mFrameQ,
  10.                                             AppCallbackNotifier::NOTIFIER_TIMEOUT);

  11.     //CAMHAL_LOGDA("Notification Thread received message");

  12.     if (mNotificationThread->msgQ().hasMsg()) {
  13.         ///Received a message from CameraHal, process it
  14.         CAMHAL_LOGDA("Notification Thread received message from Camera HAL");
  15.         shouldLive = processMessage();
  16.         if(!shouldLive) {
  17.           CAMHAL_LOGDA("Notification Thread exiting.");
  18.           return shouldLive;
  19.         }
  20.     }

  21.     if(mEventQ.hasMsg()) {
  22.         ///Received an event from one of the event providers
  23.         CAMHAL_LOGDA("Notification Thread received an event from event provider (CameraAdapter)");
  24.         notifyEvent();
  25.      }

  26.     if(mFrameQ.hasMsg()) {
  27.        ///Received a frame from one of the frame providers
  28.        //CAMHAL_LOGDA("Notification Thread received a frame from frame provider (CameraAdapter)");
  29.        notifyFrame();
  30.     }

  31.     LOG_FUNCTION_NAME_EXIT;
  32.     return shouldLive;
  33. }
When a message arrives we follow the path we care about here, preview: if mFrameQ has a message, notifyFrame() is called:

  1. void AppCallbackNotifier::notifyFrame()
  2. {
  3.     ///Receive and send the frame notifications to app
  4.     TIUTILS::Message msg;
  5.     CameraFrame *frame;
  6.     MemoryHeapBase *heap;
  7.     MemoryBase *buffer = NULL;
  8.     sp<MemoryBase> memBase;
  9.     void *buf = NULL;

  10.     LOG_FUNCTION_NAME;

  11.     {
  12.         Mutex::Autolock lock(mLock);
  13.         if(!mFrameQ.isEmpty()) {
  14.             mFrameQ.get(&msg);
  15.         } else {
  16.             return;
  17.         }
  18.     }

  19.     bool ret = true;

  20.     frame = NULL;
  21.     switch(msg.command)
  22.         {
  23.         case AppCallbackNotifier::NOTIFIER_CMD_PROCESS_FRAME:

  24.                 frame = (CameraFrame *) msg.arg1;
  25.                 if(!frame)
  26.                     {
  27.                     break;
  28.                     }

  29.                 if ( (CameraFrame::RAW_FRAME == frame->mFrameType )&&
  30.                     ( NULL != mCameraHal ) &&
  31.                     ( NULL != mDataCb) &&
  32.                     ( NULL != mNotifyCb ) )
  33.                     {

  34.                     if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE) )
  35.                         {
  36. #ifdef COPY_IMAGE_BUFFER
  37.                         copyAndSendPictureFrame(frame, CAMERA_MSG_RAW_IMAGE);
  38. #else
  39.                         //TODO: Find a way to map a Tiler buffer to a MemoryHeapBase
  40. #endif
  41.                         }
  42.                     else {
  43.                         if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE_NOTIFY) ) {
  44.                             mNotifyCb(CAMERA_MSG_RAW_IMAGE_NOTIFY, 0, 0, mCallbackCookie);
  45.                         }
  46.                         mFrameProvider->returnFrame(frame->mBuffer,
  47.                                                     (CameraFrame::FrameType) frame->mFrameType);
  48.                     }

  49.                     mRawAvailable = true;

  50.                     }
  51.                 else if ( (CameraFrame::IMAGE_FRAME == frame->mFrameType) &&
  52.                           (NULL != mCameraHal) &&
  53.                           (NULL != mDataCb) &&
  54.                           (CameraFrame::ENCODE_RAW_YUV422I_TO_JPEG & frame->mQuirks) )
  55.                     {

  56.                     int encode_quality = 100, tn_quality = 100;
  57.                     int tn_width, tn_height;
  58.                     unsigned int current_snapshot = 0;
  59.                     Encoder_libjpeg::params *main_jpeg = NULL, *tn_jpeg = NULL;
  60.                     void* exif_data = NULL;
  61.                     const char *previewFormat = NULL;
  62.                     camera_memory_t* raw_picture = mRequestMemory(-1, frame->mLength, 1, NULL);

  63.                     if(raw_picture) {
  64.                         buf = raw_picture->data;
  65.                     }

  66.                     CameraParameters parameters;
  67.                     char *params = mCameraHal->getParameters();
  68.                     const String8 strParams(params);
  69.                     parameters.unflatten(strParams);

  70.                     encode_quality = parameters.getInt(CameraParameters::KEY_JPEG_QUALITY);
  71.                     if (encode_quality < 0 || encode_quality > 100) {
  72.                         encode_quality = 100;
  73.                     }

  74.                     tn_quality = parameters.getInt(CameraParameters::KEY_JPEG_THUMBNAIL_QUALITY);
  75.                     if (tn_quality < 0 || tn_quality > 100) {
  76.                         tn_quality = 100;
  77.                     }

  78.                     if (CameraFrame::HAS_EXIF_DATA & frame->mQuirks) {
  79.                         exif_data = frame->mCookie2;
  80.                     }

  81.                     main_jpeg = (Encoder_libjpeg::params*)
  82.                                     malloc(sizeof(Encoder_libjpeg::params));

  83.                     // Video snapshot with LDCNSF on adds a few bytes start offset
  84.                     // and a few bytes on every line. They must be skipped.
  85.                     int rightCrop = frame->mAlignment/2 - frame->mWidth;

  86.                     CAMHAL_LOGDB("Video snapshot right crop = %d", rightCrop);
  87.                     CAMHAL_LOGDB("Video snapshot offset = %d", frame->mOffset);

  88.                     if (main_jpeg) {
  89.                         main_jpeg->src = (uint8_t *)frame->mBuffer->mapped;
  90.                         main_jpeg->src_size = frame->mLength;
  91.                         main_jpeg->dst = (uint8_t*) buf;
  92.                         main_jpeg->dst_size = frame->mLength;
  93.                         main_jpeg->quality = encode_quality;
  94.                         main_jpeg->in_width = frame->mAlignment/2; // use stride here
  95.                         main_jpeg->in_height = frame->mHeight;
  96.                         main_jpeg->out_width = frame->mAlignment/2;
  97.                         main_jpeg->out_height = frame->mHeight;
  98.                         main_jpeg->right_crop = rightCrop;
  99.                         main_jpeg->start_offset = frame->mOffset;
  100.                         if ( CameraFrame::FORMAT_YUV422I_UYVY & frame->mQuirks) {
  101.                             main_jpeg->format = TICameraParameters::PIXEL_FORMAT_YUV422I_UYVY;
  102.                         }
  103.                         else { //if ( CameraFrame::FORMAT_YUV422I_YUYV & frame->mQuirks)
  104.                             main_jpeg->format = CameraParameters::PIXEL_FORMAT_YUV422I;
  105.                         }
  106.                     }

  107.                     tn_width = parameters.getInt(CameraParameters::KEY_JPEG_THUMBNAIL_WIDTH);
  108.                     tn_height = parameters.getInt(CameraParameters::KEY_JPEG_THUMBNAIL_HEIGHT);
  109.                     previewFormat = parameters.getPreviewFormat();

  110.                     if ((tn_width > 0) && (tn_height > 0) && ( NULL != previewFormat )) {
  111.                         tn_jpeg = (Encoder_libjpeg::params*)
  112.                                       malloc(sizeof(Encoder_libjpeg::params));
  113.                         // if malloc fails just keep going and encode main jpeg
  114.                         if (!tn_jpeg) {
  115.                             tn_jpeg = NULL;
  116.                         }
  117.                     }

  118.                     if (tn_jpeg) {
  119.                         int width, height;
  120.                         parameters.getPreviewSize(&width,&height);
  121.                         current_snapshot = (mPreviewBufCount + MAX_BUFFERS - 1) % MAX_BUFFERS;
  122.                         tn_jpeg->src = (uint8_t *)mPreviewBuffers[current_snapshot].mapped;
  123.                         tn_jpeg->src_size = mPreviewMemory->size / MAX_BUFFERS;
  124.                         tn_jpeg->dst_size = calculateBufferSize(tn_width,
  125.                                                                 tn_height,
  126.                                                                 previewFormat);
  127.                         tn_jpeg->dst = (uint8_t*) malloc(tn_jpeg->dst_size);
  128.                         tn_jpeg->quality = tn_quality;
  129.                         tn_jpeg->in_width = width;
  130.                         tn_jpeg->in_height = height;
  131.                         tn_jpeg->out_width = tn_width;
  132.                         tn_jpeg->out_height = tn_height;
  133.                         tn_jpeg->right_crop = 0;
  134.                         tn_jpeg->start_offset = 0;
  135.                         tn_jpeg->format = CameraParameters::PIXEL_FORMAT_YUV420SP;
  136.                     }

  137.                     sp<Encoder_libjpeg> encoder = new Encoder_libjpeg(main_jpeg,
  138.                                                       tn_jpeg,
  139.                                                       AppCallbackNotifierEncoderCallback,
  140.                                                       (CameraFrame::FrameType)frame->mFrameType,
  141.                                                       this,
  142.                                                       raw_picture,
  143.                                                       exif_data, frame->mBuffer);
  144.                     gEncoderQueue.add(frame->mBuffer->mapped, encoder);
  145.                     encoder->run();
  146.                     encoder.clear();
  147.                     if (params != NULL)
  148.                       {
  149.                         mCameraHal->putParameters(params);
  150.                       }
  151.                     }
  152.                 else if ( ( CameraFrame::IMAGE_FRAME == frame->mFrameType ) &&
  153.                              ( NULL != mCameraHal ) &&
  154.                              ( NULL != mDataCb) )
  155.                     {

  156.                     // CTS, MTS requirements: Every 'takePicture()' call
  157.                     // who registers a raw callback should receive one
  158.                     // as well. This is not always the case with
  159.                     // CameraAdapters though.
  160.                     if (!mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE)) {
  161.                         dummyRaw();
  162.                     } else {
  163.                         mRawAvailable = false;
  164.                     }

  165. #ifdef COPY_IMAGE_BUFFER
  166.                     {
  167.                         Mutex::Autolock lock(mBurstLock);
  168. #if defined(OMAP_ENHANCEMENT)
  169.                         if ( mBurst )
  170.                         {
  171.                             copyAndSendPictureFrame(frame, CAMERA_MSG_COMPRESSED_BURST_IMAGE);
  172.                         }
  173.                         else
  174. #endif
  175.                         {
  176.                             copyAndSendPictureFrame(frame, CAMERA_MSG_COMPRESSED_IMAGE);
  177.                         }
  178.                     }
  179. #else
  180.                      //TODO: Find a way to map a Tiler buffer to a MemoryHeapBase
  181. #endif
  182.                     }
  183.                 else if ( ( CameraFrame::VIDEO_FRAME_SYNC == frame->mFrameType ) &&
  184.                              ( NULL != mCameraHal ) &&
  185.                              ( NULL != mDataCb) &&
  186.                              ( mCameraHal->msgTypeEnabled(CAMERA_MSG_VIDEO_FRAME) ) )
  187.                     {
  188.                     AutoMutex locker(mRecordingLock);
  189.                     if(mRecording)
  190.                         {
  191.                         if(mUseMetaDataBufferMode)
  192.                             {
  193.                             camera_memory_t *videoMedatadaBufferMemory =
  194.                                              mVideoMetadataBufferMemoryMap.valueFor(frame->mBuffer->opaque);
  195.                             video_metadata_t *videoMetadataBuffer = (video_metadata_t *) videoMedatadaBufferMemory->data;

  196.                             if( (NULL == videoMedatadaBufferMemory) || (NULL == videoMetadataBuffer) || (NULL == frame->mBuffer) )
  197.                                 {
  198.                                 CAMHAL_LOGEA("Error! One of the video buffers is NULL");
  199.                                 break;
  200.                                 }

  201.                             if ( mUseVideoBuffers )
  202.                               {
  203.                                 CameraBuffer *vBuf = mVideoMap.valueFor(frame->mBuffer->opaque);
  204.                                 GraphicBufferMapper &mapper = GraphicBufferMapper::get();
  205.                                 Rect bounds;
  206.                                 bounds.left = 0;
  207.                                 bounds.top = 0;
  208.                                 bounds.right = mVideoWidth;
  209.                                 bounds.bottom = mVideoHeight;

  210.                                 void *y_uv[2];
  211.                                 mapper.lock((buffer_handle_t)vBuf, CAMHAL_GRALLOC_USAGE, bounds, y_uv);
  212.                                 y_uv[1] = y_uv[0] + mVideoHeight*4096;

  213.                                 structConvImage input = {frame->mWidth,
  214.                                                           frame->mHeight,
  215.                                                           4096,
  216.                                                           IC_FORMAT_YCbCr420_lp,
  217.                                                           (mmByte *)frame->mYuv[0],
  218.                                                           (mmByte *)frame->mYuv[1],
  219.                                                           frame->mOffset};

  220.                                 structConvImage output = {mVideoWidth,
  221.                                                           mVideoHeight,
  222.                                                           4096,
  223.                                                           IC_FORMAT_YCbCr420_lp,
  224.                                                           (mmByte *)y_uv[0],
  225.                                                           (mmByte *)y_uv[1],
  226.                                                           0};

  227.                                 VT_resizeFrame_Video_opt2_lp(&input, &output, NULL, 0);
  228.                                 mapper.unlock((buffer_handle_t)vBuf->opaque);
  229.                                 videoMetadataBuffer->metadataBufferType = (int) kMetadataBufferTypeCameraSource;
  230.                                 /* FIXME remove cast */
  231.                                 videoMetadataBuffer->handle = (void *)vBuf->opaque;
  232.                                 videoMetadataBuffer->offset = 0;
  233.                               }
  234.                             else
  235.                               {
  236.                                 videoMetadataBuffer->metadataBufferType = (int) kMetadataBufferTypeCameraSource;
  237.                                 videoMetadataBuffer->handle = camera_buffer_get_omx_ptr(frame->mBuffer);
  238.                                 videoMetadataBuffer->offset = frame->mOffset;
  239.                               }

  240.                             CAMHAL_LOGVB("mDataCbTimestamp : frame->mBuffer=0x%x, videoMetadataBuffer=0x%x, videoMedatadaBufferMemory=0x%x",
  241.                                             frame->mBuffer->opaque, videoMetadataBuffer, videoMedatadaBufferMemory);

  242.                             mDataCbTimestamp(frame->mTimestamp, CAMERA_MSG_VIDEO_FRAME,
  243.                                                 videoMedatadaBufferMemory, 0, mCallbackCookie);
  244.                             }
  245.                         else
  246.                             {
  247.                             //TODO: Need to revisit this, should ideally be mapping the TILER buffer using mRequestMemory
  248.                             camera_memory_t* fakebuf = mRequestMemory(-1, sizeof(buffer_handle_t), 1, NULL);
  249.                             if( (NULL == fakebuf) || ( NULL == fakebuf->data) || ( NULL == frame->mBuffer))
  250.                                 {
  251.                                 CAMHAL_LOGEA("Error! One of the video buffers is NULL");
  252.                                 break;
  253.                                 }

  254.                             *reinterpret_cast<buffer_handle_t*>(fakebuf->data) = reinterpret_cast<buffer_handle_t>(frame->mBuffer->mapped);
  255.                             mDataCbTimestamp(frame->mTimestamp, CAMERA_MSG_VIDEO_FRAME, fakebuf, 0, mCallbackCookie);
  256.                             fakebuf->release(fakebuf);
  257.                             }
  258.                         }
  259.                     }
  260.                 else if(( CameraFrame::SNAPSHOT_FRAME == frame->mFrameType ) &&
  261.                              ( NULL != mCameraHal ) &&
  262.                              ( NULL != mDataCb) &&
  263.                              ( NULL != mNotifyCb)) {
  264.                     //When enabled, measurement data is sent instead of video data
  265.                     if ( !mMeasurementEnabled ) {
  266.                         copyAndSendPreviewFrame(frame, CAMERA_MSG_POSTVIEW_FRAME);
  267.                     } else {
  268.                         mFrameProvider->returnFrame(frame->mBuffer,
  269.                                                     (CameraFrame::FrameType) frame->mFrameType);
  270.                     }
  271.                 }
  272.                 else if ( ( CameraFrame::PREVIEW_FRAME_SYNC== frame->mFrameType ) &&
  273.                             ( NULL != mCameraHal ) &&
  274.                             ( NULL != mDataCb) &&
  275.                             ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME)) ) {
  276.                     //When enabled, measurement data is sent instead of video data
  277.                     if ( !mMeasurementEnabled ) {
  278.                         copyAndSendPreviewFrame(frame, CAMERA_MSG_PREVIEW_FRAME);
  279.                     } else {
  280.                          mFrameProvider->returnFrame(frame->mBuffer,
  281.                                                      (CameraFrame::FrameType) frame->mFrameType);
  282.                     }
  283.                 }
  284.                 else if ( ( CameraFrame::FRAME_DATA_SYNC == frame->mFrameType ) &&
  285.                             ( NULL != mCameraHal ) &&
  286.                             ( NULL != mDataCb) &&
  287.                             ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME)) ) {
  288.                     copyAndSendPreviewFrame(frame, CAMERA_MSG_PREVIEW_FRAME);
  289.                 } else {
  290.                     mFrameProvider->returnFrame(frame->mBuffer,
  291.                                                 ( CameraFrame::FrameType ) frame->mFrameType);
  292.                     CAMHAL_LOGDB("Frame type 0x%x is still unsupported!", frame->mFrameType);
  293.                 }

  294.                 break;

  295.         default:

  296.             break;

  297.         };

  298. exit:

  299.     if ( NULL != frame )
  300.         {
  301.         delete frame;
  302.         }

  303.     LOG_FUNCTION_NAME_EXIT;
  304. }
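The branching above boils down to one rule: every frame either triggers an app callback or is handed straight back to the frame provider (and even the callback paths return the buffer internally). A stripped-down model in plain Java, with invented frame-type values:

```java
import java.util.ArrayList;
import java.util.List;

public class FrameDispatchSketch {
    // invented stand-ins for the CameraFrame::FrameType values
    static final int PREVIEW_FRAME_SYNC = 1;
    static final int SNAPSHOT_FRAME = 2;
    static final int UNKNOWN_FRAME = 99;

    static final List<String> log = new ArrayList<>();

    // stands in for the mDataCb / mDataCbTimestamp paths
    static void sendToApp(int type) { log.add("callback:" + type); }

    // stands in for mFrameProvider->returnFrame(...)
    static void returnFrame(int type) { log.add("return:" + type); }

    static void dispatch(int type, boolean measurementEnabled) {
        if ((type == PREVIEW_FRAME_SYNC || type == SNAPSHOT_FRAME)
                && !measurementEnabled) {
            sendToApp(type);    // preview / postview callback path
        } else {
            returnFrame(type);  // every other frame goes straight back
        }
    }

    public static void main(String[] args) {
        dispatch(PREVIEW_FRAME_SYNC, false);
        dispatch(SNAPSHOT_FRAME, true);  // measurement mode: returned instead
        dispatch(UNKNOWN_FRAME, false);  // unsupported type: returned
        System.out.println(log);
    }
}
```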
Different frame types are handled differently here. Let's stick with the preview path for this analysis, which is handled as marked above. Now look at the implementation of copyAndSendPreviewFrame:

  1. void AppCallbackNotifier::copyAndSendPreviewFrame(CameraFrame* frame, int32_t msgType)
  2. {
  3.     camera_memory_t* picture = NULL;
  4.     CameraBuffer * dest = NULL;

  5.     // scope for lock
  6.     {
  7.         Mutex::Autolock lock(mLock);

  8.         if(mNotifierState != AppCallbackNotifier::NOTIFIER_STARTED) {
  9.             goto exit;
  10.         }

  11.         if (!mPreviewMemory || !frame->mBuffer) {
  12.             CAMHAL_LOGDA("Error! One of the buffer is NULL");
  13.             goto exit;
  14.         }

  15.         dest = &mPreviewBuffers[mPreviewBufCount];

  16.         CAMHAL_LOGVB("%d:copy2Dto1D(%p, %p, %d, %d, %d, %d, %d,%s)",
  17.                      __LINE__,
  18.                       dest,
  19.                       frame->mBuffer,
  20.                       mPreviewWidth,
  21.                       mPreviewHeight,
  22.                       mPreviewStride,
  23.                       2,
  24.                       frame->mLength,
  25.                       mPreviewPixelFormat);

  26.         /* FIXME map dest */
  27.         if ( NULL != dest && dest->mapped != NULL ) {
  28.             // data sync frames don't need conversion
  29.             if (CameraFrame::FRAME_DATA_SYNC == frame->mFrameType) {
  30.                 if ( (mPreviewMemory->size / MAX_BUFFERS) >= frame->mLength ) {
  31.                     memcpy(dest->mapped, (void*) frame->mBuffer->mapped, frame->mLength);
  32.                 } else {
  33.                     memset(dest->mapped, 0, (mPreviewMemory->size / MAX_BUFFERS));
  34.                 }
  35.             } else {
  36.               if ((NULL == frame->mYuv[0]) || (NULL == frame->mYuv[1])){
  37.                 CAMHAL_LOGEA("Error! One of the YUV Pointer is NULL");
  38.                 goto exit;
  39.               }
  40.               else{
  41.                 copy2Dto1D(dest->mapped,
  42.                            frame->mYuv,
  43.                            mPreviewWidth,
  44.                            mPreviewHeight,
  45.                            mPreviewStride,
  46.                            frame->mOffset,
  47.                            2,
  48.                            frame->mLength,
  49.                            mPreviewPixelFormat);
  50.               }
  51.             }
  52.         }
  53.     }

  54.  exit:
  55.     mFrameProvider->returnFrame(frame->mBuffer, (CameraFrame::FrameType) frame->mFrameType);

  56.     if((mNotifierState == AppCallbackNotifier::NOTIFIER_STARTED) &&
  57.        mCameraHal->msgTypeEnabled(msgType) &&
  58.        (dest != NULL) && (dest->mapped != NULL)) {
  59.         AutoMutex locker(mLock);
  60.         if ( mPreviewMemory )
  61.             mDataCb(msgType, mPreviewMemory, mPreviewBufCount, NULL, mCallbackCookie);
  62.     }

  63.     // increment for next buffer
  64.     mPreviewBufCount = (mPreviewBufCount + 1) % AppCallbackNotifier::MAX_BUFFERS;
  65. }
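The last statement cycles mPreviewBufCount through the preview slots. The wraparound can be seen in a few lines of plain Java (MAX_BUFFERS = 4 is an assumed value, for illustration only):

```java
public class PreviewBufCountSketch {
    static final int MAX_BUFFERS = 4; // assumed value, for illustration only

    // mirrors: mPreviewBufCount = (mPreviewBufCount + 1) % MAX_BUFFERS
    static int next(int count) {
        return (count + 1) % MAX_BUFFERS;
    }

    public static void main(String[] args) {
        int count = 0;
        StringBuilder slots = new StringBuilder();
        for (int frame = 0; frame < 6; frame++) { // 6 frames, 4 slots
            slots.append(count).append(' ');
            count = next(count);
        }
        // slots are reused in order once the index wraps around
        System.out.println(slots.toString().trim()); // prints "0 1 2 3 0 1"
    }
}
```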
We won't analyze every intermediate step here; the part to focus on is the highlighted call at the end, mDataCb, which is declared as:
camera_data_callback   mDataCb;
The implementation behind this callback is the crucial piece: it is actually provided by CameraService, and the wiring goes like this:
1. CameraService calls mHardware->setCallbacks(notifyCallback, dataCallback, dataCallbackTimestamp, (void *)cameraId);
2. CameraHardwareInterface calls mDevice->ops->set_callbacks(mDevice, __notify_cb, __data_cb, __data_cb_timestamp, __get_memory, this);
3. camerahal_module calls gCameraHals[ti_dev->cameraid]->setCallbacks(notify_cb, data_cb, data_cb_timestamp, get_memory, user);
4. CameraHal calls mAppCallbackNotifier->setCallbacks(this, notify_cb, data_cb, data_cb_timestamp, get_memory, user);
5. which lands us in AppCallbackNotifier. Here is its setCallbacks implementation:
  1. void AppCallbackNotifier::setCallbacks(CameraHal* cameraHal,
  2.                                         camera_notify_callback notify_cb,
  3.                                         camera_data_callback data_cb,
  4.                                         camera_data_timestamp_callback data_cb_timestamp,
  5.                                         camera_request_memory get_memory,
  6.                                         void *user)
  7. {
  8.     Mutex::Autolock lock(mLock);

  9.     LOG_FUNCTION_NAME;

  10.     mCameraHal = cameraHal;
  11.     mNotifyCb = notify_cb;
  12.     mDataCb = data_cb;
  13.     mDataCbTimestamp = data_cb_timestamp;
  14.     mRequestMemory = get_memory;
  15.     mCallbackCookie = user;

  16.     LOG_FUNCTION_NAME_EXIT;
  17. }
Here you can see clearly that mDataCb points at the callback defined in CameraService; it is through this mechanism that the data captured at the bottom layer is finally delivered up to the CameraService layer.
One caveat: saying that mDataCb points directly at the CameraService callback is not quite accurate. More precisely, the implementation behind mDataCb eventually calls into the callback defined in CameraService.
It's worth spending a moment on this indirection:
what mDataCb actually refers to is the __data_cb function defined in CameraHardwareInterface, as determined by the following call:
mDevice->ops->set_callbacks(mDevice,
                                   __notify_cb,
                                   __data_cb,
                                   __data_cb_timestamp,
                                   __get_memory,
                                   this);
Now look at the definition of __data_cb:

  1. static void __data_cb(int32_t msg_type,
  2.                           const camera_memory_t *data, unsigned int index,
  3.                           camera_frame_metadata_t *metadata,
  4.                           void *user)
  5.     {
  6.         LOGV("%s", __FUNCTION__);
  7.         CameraHardwareInterface *__this =
  8.                 static_cast<CameraHardwareInterface *>(user);
  9.         sp<CameraHeapMemory> mem(static_cast<CameraHeapMemory *>(data->handle));
  10.         if (index >= mem->mNumBufs) {
  11.             LOGE("%s: invalid buffer index %d, max allowed is %d", __FUNCTION__,
  12.                  index, mem->mNumBufs);
  13.             return;
  14.         }
  15.         __this->mDataCb(msg_type, mem->mBuffers[index], metadata, __this->mCbUser);
  16.     }
And where does this mDataCb come from?

  1. /** Set the notification and data callbacks */
  2.     void setCallbacks(notify_callback notify_cb,
  3.                       data_callback data_cb,
  4.                       data_callback_timestamp data_cb_timestamp,
  5.                       void* user)
  6.     {
  7.         mNotifyCb = notify_cb;
  8.         mDataCb = data_cb;
  9.         mDataCbTimestamp = data_cb_timestamp;
  10.         mCbUser = user;

  11.         LOGV("%s(%s)", __FUNCTION__, mName.string());

  12.         if (mDevice->ops->set_callbacks) {
  13.             mDevice->ops->set_callbacks(mDevice,
  14.                                    __notify_cb,
  15.                                    __data_cb,
  16.                                    __data_cb_timestamp,
  17.                                    __get_memory,
  18.                                    this);
  19.         }
  20.     }
Here you can clearly see that mDataCb ends up pointing at the dataCallback method defined in CameraService; the path just takes a small detour.
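The whole registration, then, is the same callback handed down one layer at a time; when the bottom layer fires, the call travels straight back up because every layer ultimately holds the same reference. A stripped-down model of the chain (plain Java, invented class names):

```java
public class CallbackChainSketch {
    interface DataCallback { void onData(String data); }

    // stands in for AppCallbackNotifier: holds the callback, fires it on frames
    static class Notifier {
        DataCallback cb;
        void setCallbacks(DataCallback cb) { this.cb = cb; }
        void frameReady(String data) { if (cb != null) cb.onData(data); }
    }

    // stands in for CameraHal: just forwards the registration one level down
    static class Hal {
        Notifier notifier = new Notifier();
        void setCallbacks(DataCallback cb) { notifier.setCallbacks(cb); }
    }

    // stands in for CameraService::Client: the original owner of the callback
    static class Service {
        static String received;
        Hal hal = new Hal();
        void connect() { hal.setCallbacks(d -> received = d); }
    }

    static String demo() {
        Service s = new Service();
        s.connect();                          // registration travels down
        s.hal.notifier.frameReady("frame0");  // data travels straight back up
        return Service.received;
    }

    public static void main(String[] args) {
        System.out.println(demo()); // prints "frame0"
    }
}
```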

Let's keep going and see how the data actually reaches the app. Here is the definition of the dataCallback method in CameraService:

  1. void CameraService::Client::dataCallback(int32_t msgType,
  2.         const sp<IMemory>& dataPtr, camera_frame_metadata_t *metadata, void* user) {
  3.     LOG2("dataCallback(%d)", msgType);

  4.     sp<Client> client = getClientFromCookie(user);
  5.     if (client == 0) return;
  6.     if (!client->lockIfMessageWanted(msgType)) return;

  7.     if (dataPtr == 0 && metadata == NULL) {
  8.         LOGE("Null data returned in data callback");
  9.         client->handleGenericNotify(CAMERA_MSG_ERROR, UNKNOWN_ERROR, 0);
  10.         return;
  11.     }

  12.     switch (msgType & ~CAMERA_MSG_PREVIEW_METADATA) {
  13.         case CAMERA_MSG_PREVIEW_FRAME:
  14.             client->handlePreviewData(msgType, dataPtr, metadata);
  15.             break;
  16.         case CAMERA_MSG_POSTVIEW_FRAME:
  17.             client->handlePostview(dataPtr);
  18.             break;
  19.         case CAMERA_MSG_RAW_IMAGE:
  20.             client->handleRawPicture(dataPtr);
  21.             break;
  22.         case CAMERA_MSG_COMPRESSED_IMAGE:
  23.             client->handleCompressedPicture(dataPtr);
  24.             break;
  25. #ifdef OMAP_ENHANCEMENT
  26.         case CAMERA_MSG_COMPRESSED_BURST_IMAGE:
  27.             client->handleCompressedBurstPicture(dataPtr);
  28.             break;
  29. #endif
  30.         default:
  31.             client->handleGenericData(msgType, dataPtr, metadata);
  32.             break;
  33.     }
  34. }
Preview data continues upward through the client->handlePreviewData(msgType, dataPtr, metadata) call above; in this way the data moves from the CameraService layer to the camera client layer.
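Note the switch key in the code above: `msgType & ~CAMERA_MSG_PREVIEW_METADATA`. The metadata bit can be OR'd onto a data message, and masking it off recovers the base message so the frame handler still fires. A small sketch (the constant values follow system/core/include/system/camera.h, to the best of my knowledge):

```java
public class MsgMaskSketch {
    // values as defined in system/core/include/system/camera.h (ICS era)
    static final int CAMERA_MSG_PREVIEW_FRAME    = 0x0010;
    static final int CAMERA_MSG_PREVIEW_METADATA = 0x0400;

    // mirrors: switch (msgType & ~CAMERA_MSG_PREVIEW_METADATA)
    static int baseType(int msgType) {
        return msgType & ~CAMERA_MSG_PREVIEW_METADATA;
    }

    public static void main(String[] args) {
        int combined = CAMERA_MSG_PREVIEW_FRAME | CAMERA_MSG_PREVIEW_METADATA;
        // the preview-frame case still matches when metadata rides along
        System.out.println(baseType(combined) == CAMERA_MSG_PREVIEW_FRAME); // true
    }
}
```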
Next, the camera client's side of it:

  1. // callback from camera service when frame or image is ready
  2. void Camera::dataCallback(int32_t msgType, const sp<IMemory>& dataPtr,
  3.                           camera_frame_metadata_t *metadata)
  4. {
  5.     sp<CameraListener> listener;
  6.     {
  7.         Mutex::Autolock _l(mLock);
  8.         listener = mListener;
  9.     }
  10.     if (listener != NULL) {
  11.         listener->postData(msgType, dataPtr, metadata);
  12.     }
  13. }
So what exactly is this listener? Remember that a listener was set in the JNI layer during initialization? Let's look at it again, in frameworks/base/core/jni/android_hardware_Camera.cpp:
  1. // connect to camera service
  2. static void android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz,
  3.     jobject weak_this, jint cameraId)
  4. {
  5.     sp<Camera> camera = Camera::connect(cameraId);

  6.     if (camera == NULL) {
  7.         jniThrowRuntimeException(env, "Fail to connect to camera service");
  8.         return;
  9.     }

  10.     // make sure camera hardware is alive
  11.     if (camera->getStatus() != NO_ERROR) {
  12.         jniThrowRuntimeException(env, "Camera initialization failed");
  13.         return;
  14.     }

  15.     jclass clazz = env->GetObjectClass(thiz);
  16.     if (clazz == NULL) {
  17.         jniThrowRuntimeException(env, "Can't find android/hardware/Camera");
  18.         return;
  19.     }

  20.     // We use a weak reference so the Camera object can be garbage collected.
  21.     // The reference is only used as a proxy for callbacks.
  22.     sp<JNICameraContext> context = new JNICameraContext(env, weak_this, clazz, camera);
  23.     context->incStrong(thiz);
  24.     camera->setListener(context);

  25.     // save context in opaque field
  26.     env->SetIntField(thiz, fields.context, (int)context.get());
  27. }
From the above you can see that JNICameraContext is the listener class, registered via setListener. The class is defined in frameworks/base/core/jni/android_hardware_Camera.cpp:
  1. // provides persistent context for calls from native code to Java
  2. class JNICameraContext: public CameraListener
  3. {
  4. public:
  5.     JNICameraContext(JNIEnv* env, jobject weak_this, jclass clazz, const sp<Camera>& camera);
  6.     ~JNICameraContext() { release(); }
  7.     virtual void notify(int32_t msgType, int32_t ext1, int32_t ext2);
  8.     virtual void postData(int32_t msgType, const sp<IMemory>& dataPtr,
  9.                           camera_frame_metadata_t *metadata);
  10.     virtual void postDataTimestamp(nsecs_t timestamp, int32_t msgType, const sp<IMemory>& dataPtr);
  11.     void postMetadata(JNIEnv *env, int32_t msgType, camera_frame_metadata_t *metadata);
  12.     void addCallbackBuffer(JNIEnv *env, jbyteArray cbb, int msgType);
  13.     void setCallbackMode(JNIEnv *env, bool installed, bool manualMode);
  14.     sp<Camera> getCamera() { Mutex::Autolock _l(mLock); return mCamera; }
  15.     bool isRawImageCallbackBufferAvailable() const;
  16.     void release();

  17. private:
  18.     void copyAndPost(JNIEnv* env, const sp<IMemory>& dataPtr, int msgType);
  19.     void clearCallbackBuffers_l(JNIEnv *env, Vector<jbyteArray> *buffers);
  20.     void clearCallbackBuffers_l(JNIEnv *env);
  21.     jbyteArray getCallbackBuffer(JNIEnv *env, Vector<jbyteArray> *buffers, size_t bufferSize);

  22.     jobject mCameraJObjectWeak; // weak reference to java object
  23.     jclass mCameraJClass; // strong reference to java class
  24.     sp<Camera> mCamera; // strong reference to native object
  25.     jclass mFaceClass; // strong reference to Face class
  26.     jclass mRectClass; // strong reference to Rect class
  27.     Mutex mLock;

  28.     /*
  29.      * Global reference application-managed raw image buffer queue.
  30.      *
  31.      * Manual-only mode is supported for raw image callbacks, which is
  32.      * set whenever method addCallbackBuffer() with msgType =
  33.      * CAMERA_MSG_RAW_IMAGE is called; otherwise, null is returned
  34.      * with raw image callbacks.
  35.      */
  36.     Vector<jbyteArray> mRawImageCallbackBuffers;

  37.     /*
  38.      * Application-managed preview buffer queue and the flags
  39.      * associated with the usage of the preview buffer callback.
  40.      */
  41.     Vector<jbyteArray> mCallbackBuffers; // Global reference application managed byte[]
  42.     bool mManualBufferMode; // Whether to use application managed buffers.
  43.     bool mManualCameraCallbackSet; // Whether the callback has been set, used to
  44.                                          // reduce unnecessary calls to set the callback.
  45. };
The highlighted part is the postData we hit above; let's walk through its implementation:
  1. void JNICameraContext::postData(int32_t msgType, const sp<IMemory>& dataPtr,
  2.                                 camera_frame_metadata_t *metadata)
  3. {
  4.     // VM pointer will be NULL if object is released
  5.     Mutex::Autolock _l(mLock);
  6.     JNIEnv *env = AndroidRuntime::getJNIEnv();
  7.     if (mCameraJObjectWeak == NULL) {
  8.         LOGW("callback on dead camera object");
  9.         return;
  10.     }

  11.     int32_t dataMsgType = msgType & ~CAMERA_MSG_PREVIEW_METADATA;

  12.     // return data based on callback type
  13.     switch (dataMsgType) {
  14.         case CAMERA_MSG_VIDEO_FRAME:
  15.             // should never happen
  16.             break;

  17.         // For backward-compatibility purpose, if there is no callback
  18.         // buffer for raw image, the callback returns null.
  19.         case CAMERA_MSG_RAW_IMAGE:
  20.             LOGV("rawCallback");
  21.             if (mRawImageCallbackBuffers.isEmpty()) {
  22.                 env->CallStaticVoidMethod(mCameraJClass, fields.post_event,
  23.                         mCameraJObjectWeak, dataMsgType, 0, 0, NULL);
  24.             } else {
  25.                 copyAndPost(env, dataPtr, dataMsgType);
  26.             }
  27.             break;

  28.         // There is no data.
  29.         case 0:
  30.             break;

  31.         default:
  32.             LOGV("dataCallback(%d, %p)", dataMsgType, dataPtr.get());
  33.             copyAndPost(env, dataPtr, dataMsgType);
  34.             break;
  35.     }

  36.     // post frame metadata to Java
  37.     if (metadata && (msgType & CAMERA_MSG_PREVIEW_METADATA)) {
  38.         postMetadata(env, CAMERA_MSG_PREVIEW_METADATA, metadata);
  39.     }
  40. }
Next, the copyAndPost method:
  1. void JNICameraContext::copyAndPost(JNIEnv* env, const sp<IMemory>& dataPtr, int msgType)
  2. {
  3.     jbyteArray obj = NULL;

  4.     // allocate Java byte array and copy data
  5.     if (dataPtr != NULL) {
  6.         ssize_t offset;
  7.         size_t size;
  8.         sp<IMemoryHeap> heap = dataPtr->getMemory(&offset, &size);
  9.         LOGV("copyAndPost: off=%ld, size=%d", offset, size);
  10.         uint8_t *heapBase = (uint8_t*)heap->base();

  11.         if (heapBase != NULL) {
  12.             const jbyte* data = reinterpret_cast<const jbyte*>(heapBase + offset);

  13.             if (msgType == CAMERA_MSG_RAW_IMAGE) {
  14.                 obj = getCallbackBuffer(env, &mRawImageCallbackBuffers, size);
  15.             } else if (msgType == CAMERA_MSG_PREVIEW_FRAME && mManualBufferMode) {
  16.                 obj = getCallbackBuffer(env, &mCallbackBuffers, size);

  17.                 if (mCallbackBuffers.isEmpty()) {
  18.                     LOGV("Out of buffers, clearing callback!");
  19.                     mCamera->setPreviewCallbackFlags(CAMERA_FRAME_CALLBACK_FLAG_NOOP);
  20.                     mManualCameraCallbackSet = false;

  21.                     if (obj == NULL) {
  22.                         return;
  23.                     }
  24.                 }
  25.             } else {
  26.                 LOGV("Allocating callback buffer");
  27.                 obj = env->NewByteArray(size);
  28.             }

  29.             if (obj == NULL) {
  30.                 LOGE("Couldn't allocate byte array for JPEG data");
  31.                 env->ExceptionClear();
  32.             } else {
  33.                 env->SetByteArrayRegion(obj, 0, size, data);
  34.             }
  35.         } else {
  36.             LOGE("image heap is NULL");
  37.         }
  38.     }

  39.     // post image data to Java
  40.     env->CallStaticVoidMethod(mCameraJClass, fields.post_event,
  41.             mCameraJObjectWeak, msgType, 0, 0, obj);
  42.     if (obj) {
  43.         env->DeleteLocalRef(obj);
  44.     }
  45. }
The code above first allocates a Java byte array obj and copies the data buffer into it; CallStaticVoidMethod is native code calling a Java method, and what finally executes is postEventFromNative() in the framework's Camera.java.
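At its core copyAndPost does two things: copy `size` bytes starting at `heapBase + offset` into a fresh Java byte array (SetByteArrayRegion), then post that array to Java. The copy step, modeled with a plain array standing in for the IMemoryHeap:

```java
import java.util.Arrays;

public class CopyAndPostSketch {
    // models SetByteArrayRegion: copy [offset, offset + size) out of the
    // shared heap into a fresh Java-side byte array
    static byte[] copyRegion(byte[] heapBase, int offset, int size) {
        return Arrays.copyOfRange(heapBase, offset, offset + size);
    }

    public static void main(String[] args) {
        byte[] heap = {9, 9, 1, 2, 3, 9};    // stand-in for the IMemoryHeap
        byte[] obj = copyRegion(heap, 2, 3); // off = 2, size = 3
        System.out.println(Arrays.toString(obj)); // prints "[1, 2, 3]"
    }
}
```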
From this point on, the callback enters the camera framework layer:
frameworks/base/core/java/android/hardware/Camera.java
  1. private static void postEventFromNative(Object camera_ref,
  2.                                             int what, int arg1, int arg2, Object obj)
  3.     {
  4.         Camera c = (Camera)((WeakReference)camera_ref).get();
  5.         if (c == null)
  6.             return;

  7.         if (c.mEventHandler != null) {
  8.             Message m = c.mEventHandler.obtainMessage(what, arg1, arg2, obj);
  9.             c.mEventHandler.sendMessage(m);
  10.         }
  11.     }
After sendMessage, the message is processed by the handler, which is likewise defined in the framework layer:
  1. private class EventHandler extends Handler
  2.     {
  3.         private Camera mCamera;

  4.         public EventHandler(Camera c, Looper looper) {
  5.             super(looper);
  6.             mCamera = c;
  7.         }

  8.         @Override
  9.         public void handleMessage(Message msg) {
  10.             switch(msg.what) {
  11.             case CAMERA_MSG_SHUTTER:
  12.                 if (mShutterCallback != null) {
  13.                     mShutterCallback.onShutter();
  14.                 }
  15.                 return;

  16.             case CAMERA_MSG_RAW_IMAGE:
  17.                 if (mRawImageCallback != null) {
  18.                     mRawImageCallback.onPictureTaken((byte[])msg.obj, mCamera);
  19.                 }
  20.                 return;

  21.             case CAMERA_MSG_COMPRESSED_IMAGE:
  22.                 if (mJpegCallback != null) {
  23.                     mJpegCallback.onPictureTaken((byte[])msg.obj, mCamera);
  24.                 }
  25.                 return;

  26.             case CAMERA_MSG_PREVIEW_FRAME:
  27.                 if (mPreviewCallback != null) {
  28.                     PreviewCallback cb = mPreviewCallback;
  29.                     if (mOneShot) {
  30.                         // Clear the callback variable before the callback
  31.                         // in case the app calls setPreviewCallback from
  32.                         // the callback function
  33.                         mPreviewCallback = null;
  34.                     } else if (!mWithBuffer) {
  35.                         // We're faking the camera preview mode to prevent
  36.                         // the app from being flooded with preview frames.
  37.                         // Set to oneshot mode again.
  38.                         setHasPreviewCallback(true, false);
  39.                     }
  40.                     cb.onPreviewFrame((byte[])msg.obj, mCamera);
  41.                 }
  42.                 return;

  43.             case CAMERA_MSG_POSTVIEW_FRAME:
  44.                 if (mPostviewCallback != null) {
  45.                     mPostviewCallback.onPictureTaken((byte[])msg.obj, mCamera);
  46.                 }
  47.                 return;

  48.             case CAMERA_MSG_FOCUS:
  49.                 if (mAutoFocusCallback != null) {
  50.                     mAutoFocusCallback.onAutoFocus(msg.arg1 == 0 ? false : true, mCamera);
  51.                 }
  52.                 return;

  53.             case CAMERA_MSG_ZOOM:
  54.                 if (mZoomListener != null) {
  55.                     mZoomListener.onZoomChange(msg.arg1, msg.arg2 != 0, mCamera);
  56.                 }
  57.                 return;

  58.             case CAMERA_MSG_PREVIEW_METADATA:
  59.                 if (mFaceListener != null) {
  60.                     mFaceListener.onFaceDetection((Face[])msg.obj, mCamera);
  61.                 }
  62.                 return;

  63.             case CAMERA_MSG_ERROR :
  64.                 Log.e(TAG, "Error " + msg.arg1);
  65.                 if (mErrorCallback != null) {
  66.                     mErrorCallback.onError(msg.arg1, mCamera);
  67.                 }
  68.                 return;

  69.             default:
  70.                 Log.e(TAG, "Unknown message type " + msg.what);
  71.                 return;
  72.             }
  73.         }
  74.     }
As you can see, all the callbacks are handled here: the shutter callback mShutterCallback.onShutter(), the still-image data callback mRawImageCallback.onPictureTaken(), the auto-focus callback, and so on.
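Putting the last two listings together: postEventFromNative resolves the weak reference (the Java Camera object may already have been garbage collected), bails out if it is gone, and otherwise hands the message to the handler. A minimal plain-Java model of that resolve-then-dispatch step, with a trivial handleMessage standing in for EventHandler:

```java
import java.lang.ref.WeakReference;

public class PostEventSketch {
    static final int CAMERA_MSG_PREVIEW_FRAME = 0x0010;
    static final StringBuilder handled = new StringBuilder();

    // trivial stand-in for the Java Camera object and its EventHandler
    static class Camera {
        void handleMessage(int what) {
            if (what == CAMERA_MSG_PREVIEW_FRAME) handled.append("preview");
        }
    }

    // mirrors postEventFromNative: resolve the weak ref, bail if collected
    static void postEventFromNative(WeakReference<Camera> ref, int what) {
        Camera c = ref.get();
        if (c == null) return; // the Java object is already gone
        c.handleMessage(what);
    }

    public static void main(String[] args) {
        Camera cam = new Camera(); // strong ref keeps the weak ref alive here
        postEventFromNative(new WeakReference<>(cam), CAMERA_MSG_PREVIEW_FRAME);
        System.out.println(handled); // prints "preview"
    }
}
```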
By default there is no preview callback unless your app has called setPreviewCallback. So preview data can be delivered all the way up; the system simply doesn't do it by default. Going a bit deeper: as the highlighted part above shows, the framework checks whether a PreviewCallback (an interface defined in the framework) has been registered via setPreviewCallback, and invokes it if so. The interface's onPreviewFrame method must be implemented by the app developer; there is no default implementation, so apps with special needs add their own. (This is my own understanding.) Here is the definition of the PreviewCallback interface, in frameworks/base/core/java/android/hardware/Camera.java:

  1. /**
  2.      * Callback interface used to deliver copies of preview frames as
  3.      * they are displayed.
  4.      *
  5.      * @see #setPreviewCallback(Camera.PreviewCallback)
  6.      * @see #setOneShotPreviewCallback(Camera.PreviewCallback)
  7.      * @see #setPreviewCallbackWithBuffer(Camera.PreviewCallback)
  8.      * @see #startPreview()
  9.      */
  10.     public interface PreviewCallback
  11.     {
  12.         /**
  13.          * Called as preview frames are displayed. This callback is invoked
  14.          * on the event thread {@link #open(int)} was called from.
  15.          *
  16.          * @param data the contents of the preview frame in the format defined
  17.          * by {@link android.graphics.ImageFormat}, which can be queried
  18.          * with {@link android.hardware.Camera.Parameters#getPreviewFormat()}.
  19.          * If {@link android.hardware.Camera.Parameters#setPreviewFormat(int)}
  20.          * is never called, the default will be the YCbCr_420_SP
  21.          * (NV21) format.
  22.          * @param camera the Camera service object.
  23.          */
  24.         void onPreviewFrame(byte[] data, Camera camera);
  25.     };
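To actually receive these frames, an app implements PreviewCallback.onPreviewFrame and registers it with setPreviewCallback. Since real frames need a device, here is a sketch with a hypothetical FakeCamera playing the framework's role of invoking the registered callback:

```java
public class PreviewCallbackSketch {
    interface PreviewCallback {
        void onPreviewFrame(byte[] data, FakeCamera camera);
    }

    // hypothetical stand-in for android.hardware.Camera's callback plumbing
    static class FakeCamera {
        PreviewCallback cb;
        void setPreviewCallback(PreviewCallback cb) { this.cb = cb; }
        void deliverFrame(byte[] nv21) {
            if (cb != null) cb.onPreviewFrame(nv21, this); // framework's job
        }
    }

    static int lastFrameLen = -1;

    public static void main(String[] args) {
        FakeCamera camera = new FakeCamera();
        // the app side of the contract: implement onPreviewFrame
        camera.setPreviewCallback((data, cam) -> lastFrameLen = data.length);
        camera.deliverFrame(new byte[320 * 240 * 3 / 2]); // one NV21 frame
        System.out.println(lastFrameLen); // prints 115200
    }
}
```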
Also note that shuttling preview data between the two buffers, the capture-side buffer and the display-side buffer, to achieve real-time preview display is done in the HAL layer.


That roughly covers the whole flow end to end. There are surely many gaps along the way; this is only my own study record, and some of my interpretations are bound to be wrong and will be corrected over time.

To be continued...