Category: LINUX
2009-05-09 19:50:46
g_timeout_add (200, (GSourceFunc) cb_print_position, pipeline);
This issues a position query at regular intervals (here every 200 ms).
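For reference, a minimal sketch of what a cb_print_position callback like the one above might look like, using the GStreamer 0.10 query API that this text is based on (in GStreamer 1.0, gst_element_query_position () takes the GstFormat by value instead of by pointer):

static gboolean
cb_print_position (GstElement *pipeline)
{
  GstFormat fmt = GST_FORMAT_TIME;
  gint64 pos, len;

  if (gst_element_query_position (pipeline, &fmt, &pos) &&
      gst_element_query_duration (pipeline, &fmt, &len)) {
    g_print ("Time: %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\r",
             GST_TIME_ARGS (pos), GST_TIME_ARGS (len));
  }

  /* returning TRUE keeps the timeout installed */
  return TRUE;
}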
Events: seeking (and more)
Events work in a very similar way as queries. Dispatching, for example, works exactly the same for
events (and also has the same limitations), and they can similarly be sent to the toplevel pipeline and it
will figure out everything for you. Although there are more ways in which applications and elements can
interact using events, we will only focus on seeking here. This is done using the seek-event. A seek-event
contains a playback rate, a seek offset format (which is the unit of the offsets to follow, e.g. time, audio
samples, video frames or bytes), optionally a set of seeking-related flags (e.g. whether internal buffers
should be flushed), a seek method (which indicates relative to what the offset was given), and seek
offsets. The first offset (cur) is the new position to seek to, while the second offset (stop) is optional and
specifies a position where streaming is supposed to stop. Usually it is fine to just specify
GST_SEEK_TYPE_NONE and -1 as end_method and end offset. The behaviour of a seek is also
wrapped in the gst_element_seek ().
Events work very much like queries; dispatching, for example, works the same way. Although there are many ways in which applications and elements can interact using events, here we only look at seeking, which is done with the seek event.
The effect of the GST_SEEK_FLAG_FLUSH flag: it makes the pipeline flush its internally buffered data, so the new position takes effect immediately instead of only after the already queued data has been played out.
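A minimal sketch of such a flushing seek, assuming "pipeline" is an already-playing pipeline; it jumps to the 30-second mark and, as suggested above, passes GST_SEEK_TYPE_NONE and -1 for the stop position:

/* seek to 30 seconds, flushing internal buffers so it takes effect at once */
if (!gst_element_seek (pipeline, 1.0, GST_FORMAT_TIME,
                       GST_SEEK_FLAG_FLUSH,
                       GST_SEEK_TYPE_SET, 30 * GST_SECOND,
                       GST_SEEK_TYPE_NONE, -1)) {
  g_print ("Seek failed!\n");
}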
Metadata
GStreamer makes a clear distinction between two types of metadata, and has support for both types. The
first is stream tags, which describe the content of a stream in a non-technical way. Examples include the
author of a song, the title of that very same song or the album it is a part of. The other type of metadata is
stream-info, which is a somewhat technical description of the properties of a stream. This can include
video size, audio samplerate, codecs used and so on. Tags are handled using the GStreamer tagging
system. Stream-info can be retrieved from a GstPad.
GStreamer distinguishes two kinds of metadata:
1. Stream tags, which describe the content of a stream: for example the artist or the song title.
2. Stream-info, which describes the technical properties of a stream: for example the sample rate or the codec used.
Tags are handled by GStreamer's tagging system, while stream-info can be retrieved from a GstPad.
Metadata reading
Stream information can most easily be read by reading them from a GstPad. This has already been
discussed before in Section 8.3.1. Therefore, we will skip it here. Note that this requires access to all
pads of which you want stream information.
Tag reading is done through a bus in GStreamer, which has been discussed previously in Chapter 7. You
can listen for GST_MESSAGE_TAG messages and handle them as you wish.
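For illustration, a sketch of a bus watch that reacts to GST_MESSAGE_TAG messages; the print_one_tag helper is hypothetical, and with GStreamer 1.0 the final gst_tag_list_free () call becomes gst_tag_list_unref ():

static void
print_one_tag (const GstTagList *list, const gchar *tag, gpointer user_data)
{
  g_print ("  found tag: %s\n", tag);
}

static gboolean
cb_bus_watch (GstBus *bus, GstMessage *message, gpointer user_data)
{
  if (GST_MESSAGE_TYPE (message) == GST_MESSAGE_TAG) {
    GstTagList *tags = NULL;

    gst_message_parse_tag (message, &tags);
    gst_tag_list_foreach (tags, print_one_tag, NULL);
    gst_tag_list_free (tags);
  }

  /* keep watching the bus */
  return TRUE;
}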
Interfaces
In Section 5.3, you have learned how to use GObject properties as a simple way to do interaction
between applications and elements. This method suffices for the simple’n’straight settings, but fails for
anything more complicated than a getter and setter. For the more complicated use cases, GStreamer uses
interfaces based on the Glib GInterface type.
Most of the interfaces handled here will not contain any example code. See the API references for
details. Here, we will just describe the scope and purpose of each interface.
GObject properties only cover simple set/get style interaction. For anything more complicated, GStreamer provides a richer set of interfaces built on GLib's GInterface type.
*The URI interface
In all examples so far, we have only supported local files through the “filesrc” element. GStreamer,
obviously, supports many more location sources. However, we don’t want applications to need to know
any particular element implementation details, such as element names for particular network source types
and so on. Therefore, there is a URI interface, which can be used to get the source element that supports a
particular URI type. There is no strict rule for URI naming, but in general we follow naming conventions
that others use, too. For example, assuming you have the correct plugins installed, GStreamer supports
“file://”, “http://” and many other URI schemes.
In order to get the source or sink element supporting a particular URI, use
gst_element_make_from_uri (), with the URI type being either GST_URI_SRC for a source
element, or GST_URI_SINK for a sink element.
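A sketch of this call, using the GStreamer 0.10 signature (GStreamer 1.0 adds a GError ** argument); the URI shown here is only an example:

GstElement *src;

/* ask GStreamer for a source element able to handle this URI */
src = gst_element_make_from_uri (GST_URI_SRC, "file:///tmp/music.ogg", "source");
if (src == NULL)
  g_warning ("no source element could be found for this URI");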
*The Mixer interface
The mixer interface provides a uniform way to control the volume on a hardware (or software) mixer.
The interface is primarily intended to be implemented by elements for audio inputs and outputs that talk
directly to the hardware (e.g. OSS or ALSA plugins).
Using this interface, it is possible to control a list of tracks (such as Line-in, Microphone, etc.) from a
mixer element. They can be muted, their volume can be changed and, for input tracks, their record flag
can be set as well.
Example plugins implementing this interface include the OSS elements (osssrc, osssink, ossmixer) and
the ALSA plugins (alsasrc, alsasink and alsamixer).
*The Tuner interface
The tuner interface is a uniform way to control inputs and outputs on a multi-input selection device. This
is primarily used for input selection on elements for TV- and capture-cards.
Using this interface, it is possible to select one track from a list of tracks supported by that tuner-element.
The tuner will then select that track for media-processing internally. This can, for example, be used to
switch inputs on a TV-card (e.g. from Composite to S-video).
This interface is currently only implemented by the Video4linux and Video4linux2 elements.
*The Color Balance interface
The colorbalance interface is a way to control video-related properties on an element, such as brightness,
contrast and so on. It’s sole reason for existance is that, as far as its authors know, there’s no way to
dynamically register properties using GObject.
The colorbalance interface is implemented by several plugins, including xvimagesink and the
Video4linux and Video4linux2 elements.
*The Property Probe interface
The property probe is a way to autodetect allowed values for a GObject property. Its primary use is to
autodetect devices in several elements. For example, the OSS elements use this interface to detect all
OSS devices on a system. Applications can then “probe” this property and get a list of detected devices.
Note: Given the overlap between HAL and the practical implementations of this interface, this might
in time be deprecated in favour of HAL.
This interface is currently implemented by many elements, including the ALSA, OSS, XVImageSink,
Video4linux and Video4linux2 elements.
*The X Overlay interface
The X Overlay interface was created to solve the problem of embedding video streams in an application
window. The application provides an X-window to the element implementing this interface to draw on,
and the element will then use this X-window to draw on rather than creating a new toplevel window. This
is useful to embed video in video players.
This interface is implemented by, amongst others, the Video4linux and Video4linux2 elements and by
ximagesink, xvimagesink and sdlvideosink.
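A sketch using the GStreamer 0.10 interface header <gst/interfaces/xoverlay.h>; "videosink" is assumed to be an element implementing the interface and "xwindow_id" the XID of the application window (in GStreamer 1.0 this interface became GstVideoOverlay with gst_video_overlay_set_window_handle ()):

/* tell the sink to render into our window instead of creating its own */
gst_x_overlay_set_xwindow_id (GST_X_OVERLAY (videosink), xwindow_id);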
Clocks in GStreamer
To maintain sync in pipeline playback (which is the only case where this really matters), GStreamer uses
clocks. Clocks are exposed by some elements, whereas other elements are merely clock slaves. The
primary task of a clock is to represent the time progress according to the element exposing the clock,
based on its own playback rate. If no clock provider is available in a pipeline, the system clock is used
instead.
To keep playback in sync, GStreamer uses clocks. There are several different kinds of clocks; the absolute value of a single clock is meaningless, only the differences between clock readings matter.
Clock providers
Clock providers exist because they play back media at some rate, and this rate is not necessarily the same
as the system clock rate. For example, a soundcard may play back at 44.1 kHz, but that doesn't mean that
after exactly 1 second according to the system clock, the soundcard has played back 44,100 samples.
This is only true by approximation. Therefore, generally, pipelines with an audio output use the
audiosink as clock provider. This ensures that one second of video will be played back at the same rate as
the soundcard plays back one second of audio.
A clock provider plays back media at its own rate, and that rate is not necessarily the same as the system clock. A sound card may nominally run at 44.1 kHz, but only approximately so. A pipeline with audio output therefore uses the audiosink as its clock provider, which keeps one second of video in step with one second of audio played by the sound card.
Clock slaves
Clock slaves get assigned a clock by their containing pipeline. Their task is to make sure that media
playback follows the time progress as represented by this clock as closely as possible. For most
elements, that will simply mean to wait until a certain time is reached before playing back their current
sample; this can be done with the function gst_clock_id_wait (). Some elements may need to
support dropping samples too, however.
A clock slave is assigned a clock by its containing pipeline. It waits until a given time is reached before playing back its current sample; some elements also need to support dropping samples.
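Roughly, the waiting a clock slave performs inside an element could look like the following sketch; "clock", "base_time" and "timestamp" are assumed to be provided by the element:

GstClockID id;
GstClockReturn ret;

/* wait until the buffer's timestamp (relative to the base time) is reached */
id = gst_clock_new_single_shot_id (clock, base_time + timestamp);
ret = gst_clock_id_wait (id, NULL);
gst_clock_id_release (id);

if (ret == GST_CLOCK_EARLY) {
  /* we are running late; an element may choose to drop this sample */
}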
Dynamic Controllable Parameters
Getting Started
The controller subsystem offers a lightweight way to adjust GObject properties over stream-time. It works
by using time-stamped value pairs that are queued for element-properties. At run-time the elements
continuously pull value changes for the current stream-time.
The controller mechanism offers an easy way to adjust GObject properties while the pipeline is running. It keeps time-stamped value pairs queued for element properties; at run time the element continuously pulls the value changes that apply to the current stream time.
Setting up parameter control
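A minimal sketch of setting up parameter control, based on the GStreamer 0.10 controller API (<gst/controller/gstcontroller.h>); "element" and its "volume" property are assumptions, and the controller API was reworked in GStreamer 1.0, so treat this purely as an illustration of the idea:

GstController *ctrl;
GValue val = { 0, };

/* the controller library has to be initialised once */
gst_controller_init (&argc, &argv);

/* attach a controller to the element's "volume" property (assumed to exist) */
ctrl = gst_controller_new (G_OBJECT (element), "volume", NULL);
gst_controller_set_interpolation_mode (ctrl, "volume", GST_INTERPOLATE_LINEAR);

/* queue two time-stamped values: silence at 0 s, full volume at 5 s */
g_value_init (&val, G_TYPE_DOUBLE);
g_value_set_double (&val, 0.0);
gst_controller_set (ctrl, "volume", 0 * GST_SECOND, &val);
g_value_set_double (&val, 1.0);
gst_controller_set (ctrl, "volume", 5 * GST_SECOND, &val);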
Pipeline manipulation
This chapter will discuss how you can manipulate your pipeline in several ways from within your
application. Parts of this chapter are downright hackish, so be assured that you'll need some programming
knowledge before you start reading this.
Topics that will be discussed here include how you can insert data into a pipeline from your application,
how to read data from a pipeline, how to manipulate the pipeline’s speed, length, starting point and how
to listen to a pipeline’s data processing.