PlayBin provides a stand-alone, everything-in-one abstraction for an audio and/or video player. It can handle both audio and video files and features automatic file type recognition, meta info (tag) extraction, buffering when playing streams over a network, and volume control.
A playbin element can be created just like any other element using
ElementFactory.make(java.lang.String, java.lang.String)
, although to call PlayBin-specific methods, it
is best to create one via the PlayBin.PlayBin(String)
or PlayBin.PlayBin(String, URI)
constructor.
The file/URI to play should be set via PlayBin.setInputFile(java.io.File)
or PlayBin.setURI(java.net.URI)
.
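As a minimal sketch (assuming PlayBin lives in the org.gstreamer.elements package of gstreamer-java; the element name and file path are placeholders):
import java.net.URI;
import org.gstreamer.Gst;
import org.gstreamer.elements.PlayBin;

public class PlayerExample {
    public static void main(String[] args) {
        args = Gst.init("PlayerExample", args); // GStreamer must be initialised first
        PlayBin playbin = new PlayBin("player");
        playbin.setURI(URI.create("file:///path/to/somefile.ogg")); // placeholder URI
    }
}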
Playbin is a Pipeline
. It will notify the application of everything
that's happening (errors, end of stream, tags found, state changes, etc.)
by posting messages on its Bus
. The application needs to watch the
bus.
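A sketch of watching the bus for errors and end-of-stream, assuming gstreamer-java's Bus.ERROR and Bus.EOS listener interfaces:
playbin.getBus().connect(new Bus.ERROR() {
    public void errorMessage(GstObject source, int code, String message) {
        System.out.println("Error " + code + " from " + source + ": " + message);
    }
});
playbin.getBus().connect(new Bus.EOS() {
    public void endOfStream(GstObject source) {
        System.out.println("End of stream");
    }
});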
Playback can be initiated by setting the PlayBin to the PLAYING state using
setState
or play
. Note that the state change will take place in
the background in a separate thread; when the function returns, playback
is probably not happening yet and any errors might not have occurred yet.
Applications using playbin should ideally be written to deal with things
completely asynchronously.
When playback has finished (an EOS message has been received on the bus), or an error has occurred (an ERROR message has been received on the bus), or the user wants to play a different track, playbin should be set back to the READY or NULL state, the input file/URI should be set to the new location, and playbin set to the PLAYING state again.
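A sketch of that restart sequence (the next URI is a placeholder):
playbin.setState(State.READY); // or State.NULL
playbin.setURI(URI.create("file:///path/to/next-track.ogg")); // placeholder next track
playbin.play(); // back to PLAYING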
Seeking can be done using seek
on the playbin element.
Again, the seek will not be executed instantaneously, but will be done in a
background thread. When the seek call returns, the seek will most likely still
be in progress. An application may wait for the seek to finish (or fail) using
Element.getState(long)
with -1 as the timeout, but this will block the user
interface and is not recommended at all.
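For example, a sketch of an asynchronous seek request (assuming the ClockTime-based seek overload of gstreamer-java's Pipeline):
playbin.seek(ClockTime.fromSeconds(30)); // returns quickly; the seek completes in the background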
Applications may query the current position and duration of the stream
via Pipeline.queryPosition()
and Pipeline.queryDuration()
, passing Format.TIME
as the format. If the query was successful,
the duration or position will have been returned in units of nanoseconds.
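For example, a sketch of a time query (the exact failure convention of these calls is an assumption):
long position = playbin.queryPosition(Format.TIME); // nanoseconds if the query succeeded
long duration = playbin.queryDuration(Format.TIME); // nanoseconds if the query succeeded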
By default, if no audio sink or video sink has been specified via PlayBin.setAudioSink(org.gstreamer.Element)
and PlayBin.setVideoSink(org.gstreamer.Element)
, playbin will use the autoaudiosink and autovideosink
elements to find the first-best available output method.
This should work in most cases, but is not always desirable. Often either
the user or application might want to specify more explicitly what to use
for audio and video output.
If the application wants more control over how audio or video should be
output, it may create the audio/video sink elements itself (for example
using ElementFactory.make(java.lang.String, java.lang.String)
) and provide them to playbin using PlayBin.setAudioSink(org.gstreamer.Element)
and PlayBin.setVideoSink(org.gstreamer.Element)
.
GNOME-based applications, for example, will usually want to create gconfaudiosink and gconfvideosink elements and make playbin use those, so that output happens to whatever the user has configured in the GNOME Multimedia System Selector configuration dialog.
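For example, a sketch of supplying explicit sinks (using the factory names mentioned above; the element names are placeholders):
playbin.setAudioSink(ElementFactory.make("gconfaudiosink", "audio-sink"));
playbin.setVideoSink(ElementFactory.make("gconfvideosink", "video-sink"));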
The sink elements do not necessarily need to be ready-made sinks. It is
possible to create container elements that look like a sink to playbin,
but in reality contain a number of custom elements linked together. This
can be achieved by creating a Bin
and putting elements in there and
linking them, and then creating a sink GhostPad
for the bin and pointing
it to the sink pad of the first element within the bin. This can be used
for a number of purposes, for example to force output to a particular
format or to modify or observe the data before it is output.
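A sketch of such a container sink (the element choices here are illustrative, not prescribed by the API):
Bin bin = new Bin("audio-sink-bin");
Element convert = ElementFactory.make("audioconvert", "convert"); // stand-in for any processing element
Element sink = ElementFactory.make("autoaudiosink", "real-sink");
bin.addMany(convert, sink);
convert.link(sink);
// Expose the first element's sink pad as the bin's own sink pad
bin.addPad(new GhostPad("sink", convert.getStaticPad("sink")));
playbin.setAudioSink(bin);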
It is also possible to 'suppress' audio and/or video output by using 'fakesink' elements (or capture it from there using the fakesink element's "handoff" signal, which, nota bene, is fired from the streaming thread!).
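For example, to discard the video stream entirely (a sketch; the element name is a placeholder):
playbin.setVideoSink(ElementFactory.make("fakesink", "video-discard"));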
Most of the common meta data (artist, title, etc.) can be retrieved by watching for TAG messages on the pipeline's bus (see above).
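A sketch of a tag watcher, assuming gstreamer-java's Bus.TAG listener and that the tags of interest are string-valued:
playbin.getBus().connect(new Bus.TAG() {
    public void tagsFound(GstObject source, TagList tagList) {
        for (String tagName : tagList.getTagNames()) {
            // getString(..) is assumed here; non-string tags would need other accessors
            System.out.println(tagName + ": " + tagList.getString(tagName, 0));
        }
    }
});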
Other, more specific meta information like the width/height/framerate of video streams or the samplerate/number of channels of audio streams can be obtained using the "stream-info" property, which will return a GList of stream info objects, one for each stream. These are opaque objects that can only be accessed via the standard GObject property interface, i.e. g_object_get(). Each stream info object exposes a number of properties describing the stream (such as its language code).
Stream information from the stream-info properties is best queried once playbin has changed into the PAUSED or PLAYING state (which can be detected via a state-changed message on the bus where old_state=READY and new_state=PAUSED), since before that the list might not be complete yet or might not contain all available information (like language codes).
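One way to detect that transition (a sketch assuming a Bus.STATE_CHANGED listener with this signature):
playbin.getBus().connect(new Bus.STATE_CHANGED() {
    public void stateChanged(GstObject source, State old, State current, State pending) {
        if (source == playbin && old == State.READY && current == State.PAUSED) {
            // stream-info should be complete from this point on
        }
    }
});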
Playbin handles buffering automatically for the most part, but applications need to handle parts of the buffering process as well. Whenever playbin is buffering, it will post BUFFERING messages on the bus with a percentage value that shows the progress of the buffering process. Applications need to set playbin to PLAYING or PAUSED state in response to these messages. They may also want to convey the buffering progress to the user in some way. Here is how to extract the percentage information from the message (requires GStreamer >= 0.10.11):
PlayBin playbin = new PlayBin("player");
playbin.getBus().connect(new Bus.BUFFERING() {
    public void bufferingMessage(GstObject element, int percent) {
        System.out.printf("Buffering (%d percent done)\n", percent);
    }
});
Note that applications should keep/set the pipeline in the PAUSED state when a BUFFERING message is received with a buffer percent value < 100, and set the pipeline back to the PLAYING state when a BUFFERING message with a value of 100 percent is received (if PLAYING is the desired state, that is).
By default, playbin (or rather the video sinks used) will create their own window. Applications will usually want to force output to a window of their own, however. This can be done using the GstXOverlay interface, which most video sinks implement. See the documentation there for more details.
The device to use for CDs/DVDs needs to be set on the source element playbin creates before it is opened. The only way to do this at the moment is to connect to playbin's "notify::source" signal, which will be emitted by playbin when it has created the source element for a particular URI. In the signal callback you can check if the source element has a "device" property and set it appropriately. In the future, ways might be added to specify the device as part of the URI, but at the time of writing this is not possible yet.
Here is a simple pipeline to play back a video or audio file:
gst-launch -v playbin uri=file:///path/to/somefile.avi
This will play back the given AVI video file, given that the video and audio decoders required to decode the content are installed. Since no special audio sink or video sink is supplied (not possible via gst-launch), playbin will try to find a suitable audio and video sink automatically using the autoaudiosink and autovideosink elements.
Here is another pipeline to play track 4 of an audio CD:
gst-launch -v playbin uri=cdda://4
This will play back track 4 on an audio CD in your disc drive (assuming the drive is detected automatically by the plugin).
Here is another pipeline to play title 1 of a DVD:
gst-launch -v playbin uri=dvd://1
This will play back title 1 of a DVD in your disc drive (assuming
the drive is detected automatically by the plugin).