In addition to obtaining the decoded data through OMXCodec::read, AwesomePlayer::onVideoEvent must also hand that data (mVideoBuffer) to the video renderer so that it can be drawn on the screen.
(1) Before the data in mVideoBuffer can be drawn, mVideoRenderer must first be created.
void AwesomePlayer::onVideoEvent() {
    ...
    if (mVideoRenderer == NULL) {
        initRenderer_l();
    }
    ...
}
void AwesomePlayer::initRenderer_l() {
    if (!strncmp("OMX.", component, 4)) {
        mVideoRenderer = new AwesomeRemoteRenderer(
            mClient.interface()->createRenderer(
                mISurface, component, ...));             .......... (2)
    } else {
        mVideoRenderer = new AwesomeLocalRenderer(
            ..., component, mISurface);                  .......... (3)
    }
}
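An OMX component decodes inside the mediaserver process, so its output buffers live there and can only be reached through a Binder proxy (AwesomeRemoteRenderer); a software decoder runs in the player's own process, so an in-process wrapper (AwesomeLocalRenderer) is enough. For reference, below is a simplified sketch of the remote wrapper, based on the Froyo-era AwesomePlayer.cpp; member names such as mTarget and the kKeyBufferID key are taken from that version and may differ in other releases:

struct AwesomeRemoteRenderer : public AwesomeRenderer {
    AwesomeRemoteRenderer(const sp<IOMXRenderer> &target)
        : mTarget(target) {
    }

    // The decoded frame stays inside mediaserver; only its OMX buffer id
    // is sent across Binder to the IOMXRenderer created in step (2).
    virtual void render(MediaBuffer *buffer) {
        void *id;
        if (buffer->meta_data()->findPointer(kKeyBufferID, &id)) {
            mTarget->render((IOMX::buffer_id)id);
        }
    }

private:
    sp<IOMXRenderer> mTarget;
};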
(2) If the video decoder is an OMX component, an AwesomeRemoteRenderer is created as mVideoRenderer.
As the code in (1) shows, an AwesomeRemoteRenderer is essentially a wrapper around what OMX::createRenderer creates. createRenderer first tries to build a hardware renderer -- SharedVideoRenderer (libstagefrighthw.so); if that fails, it falls back to a software renderer -- SoftwareRenderer (surface).
sp<IOMXRenderer> OMX::createRenderer(...) {
    VideoRenderer *impl = NULL;

    libHandle = dlopen("libstagefrighthw.so", RTLD_NOW);

    if (libHandle) {
        CreateRendererFunc func = dlsym(libHandle, ...);
        impl = (*func)(...);                 <----- Hardware Renderer
    }

    if (!impl) {
        impl = new SoftwareRenderer(...);    <----- Software Renderer
    }
}
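The hardware renderer is therefore a vendor plugin: createRenderer dlopen's libstagefrighthw.so, looks up a factory function with dlsym, and only falls back to SoftwareRenderer when the library or the symbol is missing. The standalone example below illustrates that load-or-fall-back pattern in plain C++; the library name libvendorrenderer.so and the symbol createVendorRenderer are made-up placeholders, not the actual names used by Stagefright:

// Generic illustration of the dlopen/dlsym fallback pattern used above.
// "libvendorrenderer.so" and "createVendorRenderer" are hypothetical names.
#include <dlfcn.h>
#include <cstddef>
#include <cstdio>

struct Renderer {
    virtual void render(const void *data, size_t size) = 0;
    virtual ~Renderer() {}
};

// In-process fallback, analogous to SoftwareRenderer.
struct SoftwareFallback : public Renderer {
    virtual void render(const void *, size_t) { printf("software path\n"); }
};

typedef Renderer *(*CreateRendererFunc)();

Renderer *createRenderer() {
    Renderer *impl = NULL;

    // Try the vendor-provided hardware renderer first.
    void *libHandle = dlopen("libvendorrenderer.so", RTLD_NOW);
    if (libHandle) {
        CreateRendererFunc func =
            (CreateRendererFunc)dlsym(libHandle, "createVendorRenderer");
        if (func) {
            impl = (*func)();              // hardware renderer
        }
    }

    if (impl == NULL) {
        impl = new SoftwareFallback;       // software renderer
    }
    return impl;
}

int main() {
    Renderer *r = createRenderer();        // falls back to SoftwareFallback here
    r->render(NULL, 0);
    delete r;
    return 0;
}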
(3) If the video decoder is a software component, an AwesomeLocalRenderer is created as mVideoRenderer.
The constructor of AwesomeLocalRenderer calls its own init function, which does exactly the same thing as OMX::createRenderer.
void AwesomeLocalRenderer::init(...) {
    mLibHandle = dlopen("libstagefrighthw.so", RTLD_NOW);

    if (mLibHandle) {
        CreateRendererFunc func = dlsym(...);
        mTarget = (*func)(...);              <----- Hardware Renderer
    }

    if (mTarget == NULL) {
        mTarget = new SoftwareRenderer(...); <----- Software Renderer
    }
}
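The difference from the remote case is only the data path: here mTarget (a VideoRenderer) lives in the player's own process, so the decoded bytes are handed to it directly instead of going through Binder. A simplified sketch of the local render path, based on the Froyo-era AwesomePlayer.cpp (exact signatures may differ in other versions):

void AwesomeLocalRenderer::render(MediaBuffer *buffer) {
    // Pass the decoded pixel data straight to the in-process VideoRenderer.
    render((const uint8_t *)buffer->data() + buffer->range_offset(),
           buffer->range_length());
}

void AwesomeLocalRenderer::render(const void *data, size_t size) {
    mTarget->render(data, size, NULL);   // mTarget was created in init()
}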
(4) Once mVideoRenderer has been created, the decoded data can be passed to it for rendering.
void AwesomePlayer::onVideoEvent() {
    if (!mVideoBuffer) {
        mVideoSource->read(&mVideoBuffer, ...);
    }

    [Check Timestamp]

    if (mVideoRenderer == NULL) {
        initRenderer_l();
    }

    mVideoRenderer->render(mVideoBuffer);    <----- Render Data
}
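The [Check Timestamp] step above is where audio/video synchronization happens. The fragment below is only an illustrative sketch of that kind of check, not the actual framework code: it reads the frame's timestamp (kKeyTime), compares it with the current playback clock, re-posts the event if the frame is early, and drops the frame if it is far too late. getPlaybackPositionUs and the 10 ms / 40 ms thresholds are assumptions made up for this illustration.

    int64_t timeUs;
    CHECK(mVideoBuffer->meta_data()->findInt64(kKeyTime, &timeUs));

    int64_t nowUs = getPlaybackPositionUs();   // hypothetical: current clock (e.g. audio position)
    int64_t latenessUs = nowUs - timeUs;

    if (latenessUs < -10000) {                 // frame is >10 ms early: wait and retry
        postVideoEvent_l(10);
        return;
    }

    if (latenessUs > 40000) {                  // frame is >40 ms late: drop it
        mVideoBuffer->release();
        mVideoBuffer = NULL;
        postVideoEvent_l();
        return;
    }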