Re: imxipuvideosink in 3.10.53 on Nitrogen6x-Lite


Gary Thomas <samoht.yrag@...>
 

On 2015-05-18 17:52, Nikolay Dimitrov wrote:
Hi Gary,

On 05/18/2015 07:27 PM, Gary Thomas wrote:
On 2015-05-18 09:55, Nikolay Dimitrov wrote:
Hi Gary,

On 05/18/2015 03:04 PM, Gary Thomas wrote:
On 2015-05-18 02:18, Nikolay Dimitrov wrote:
Hi Pawel,

On 05/18/2015 08:34 AM, Paweł Żabiełowicz wrote:
Hi all,

I'm having some problems running video playback on a Nitrogen6x-Lite. I'm
using fido with the 3.10.53 kernel. The display is running properly, as I
see a console after start, but playing any simple video with GStreamer 1.0
+ the gstreamer-imx plugins gives no video output, even though the
decoding looks like it's working.

Gstreamer log:
gst-launch-1.0 rtspsrc location=rtsp://watch:watch13579@192.168.7.24:554/profile3/media.smp ! rtph264depay ! imxvpudec ! imxipuvideosink
[INFO] Product Info: i.MX6Q/D/S
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to
rtsp://192.168.7.24:554/profile3/media.smp
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 1
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:15.125067669
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Plugins installed:

gst-inspect-1.0 | grep imx
imxipu: imxipuvideotransform: Freescale IPU video transform
imxipu: imxipuvideosink: Freescale IPU video sink
imxvpu: imxvpudec: Freescale VPU video decoder
imxvpu: imxvpuenc_h263: Freescale VPU h.263 video encoder
imxvpu: imxvpuenc_h264: Freescale VPU h.264 video encoder
imxvpu: imxvpuenc_mpeg4: Freescale VPU MPEG-4 video encoder
imxvpu: imxvpuenc_mjpeg: Freescale VPU motion JPEG video encoder
imxg2d: imxg2dvideosink: Freescale G2D video sink
imxg2d: imxg2dvideotransform: Freescale G2D video transform
imxaudio: imxuniaudiodec: Freescale i.MX uniaudio decoder
imxv4l2src: imxv4l2src: V4L2 CSI Video Source
imxpxp: imxpxpvideosink: Freescale PxP video sink
imxpxp: imxpxpvideotransform: Freescale PxP video transform
imxeglvivsink: imxeglvivsink: Freescale EGL video sink

While using imxeglvivsink I see a red flash for a few milliseconds over
the console, and that's it.
Any advice would be helpful.
Please download this file and try playing it locally, to rule out
possible networking issues:

https://download.blender.org/durian/trailer/sintel_trailer-1080p.mp4

Also, please check whether an automatically constructed pipeline is
able to play the file above:

gst-launch-1.0 playbin uri=file:///sintel_trailer-1080p.mp4

You can try the same tests with Xorg running.
What would that pipeline look like (to force the result into a window
on the screen)? As is, this simple command takes over the entire
screen.
The automatically generated pipeline depends entirely on the media
itself and the set of plugins you have installed - playbin/decodebin
and friends look at the stream properties and search for the
highest-ranked plugin that can be plugged in at each point. The best
place to look is usually the pipeline graphs (.dot files) dumped after
playbin was able to play something (I do this regularly, as my
hand-built pipelines are usually worse than the ones playbin creates).
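As an aside, for anyone following the thread: GStreamer only dumps those .dot graphs when GST_DEBUG_DUMP_DOT_DIR is set in the environment of the gst-launch/playbin process, and graphviz can turn them into images. A minimal sketch (the /tmp/gst-dot path is an arbitrary choice):

```shell
# GStreamer writes pipeline graphs here on state changes;
# the variable must be set before the process starts.
export GST_DEBUG_DUMP_DOT_DIR=/tmp/gst-dot
mkdir -p "$GST_DEBUG_DUMP_DOT_DIR"

# Example run (on the target board):
# gst-launch-1.0 playbin uri=file:///sintel_trailer-1080p.mp4

# Render the PLAYING-state graph with graphviz:
# dot -Tpng "$GST_DEBUG_DUMP_DOT_DIR"/*PLAYING*.dot -o pipeline.png
```

The PLAYING-state dump is usually the interesting one, since by then all dynamic pads have been created and linked.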
I tried interpreting these files, and since I don't do it all the time
(like you do), it was hard for me to see the whole picture. Is it
possible to look at such a graph and write something equivalent that
could be run via gst-launch?

Here's my graph in case you can make any sense of it.
Not sure that the following will help much, but at least I can share
what I know, so we can be in the same boat...

Please look at the demuxer (qtdemux0) and notice that the lower 2 source
pads are drawn with dashed lines, which means these pads are dynamic,
i.e. they don't exist all the time (there are more components with
dynamic pads in this pipeline).

What happens is that after playback starts, the demuxer only consumes
data for a while and doesn't have any output pads. Once the demuxer
understands which logical streams are carried over the transport, it
dynamically creates a number of source pads, one for each supported
content type (usually video/audio/subtitles, but there can be more).

When the demuxer's dynamic pads are created, it emits "pad-added"
signals; the parent element (decodebin0) catches these, looks up
appropriate decoding elements, and attaches them to the pipeline
dynamically.

The trouble with these dynamic pipelines is that you can't refer to the
source/sink pads of the elements before the pipeline is created, which
means you can't construct a working dynamic pipeline from the command
line. I looked into how to do this, but couldn't find a solution. If
someone knows how to do this, it would be great to share.

So the only way I know to create fully custom dynamic pipelines (i.e.
driven by the content) is via source code; otherwise we have to use
automated bins like playbin/decodebin and pass them some parameters.
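For what it's worth, gst-launch-1.0's pipeline parser does attempt delayed linking when you reference a named element's pads: writing "demux." makes it wait for the demuxer's pad-added signals and link then. Whether it picks the right pad is media-dependent, so treat the following as a sketch to verify on the board, not a guaranteed recipe; the element choices mirror the plugin list earlier in the thread:

```shell
# A manual, static description of roughly what playbin builds for the
# MP4 trailer. "demux." asks gst-launch to link qtdemux's source pads
# once the demuxer creates them (delayed linking).
PIPELINE='filesrc location=sintel_trailer-1080p.mp4 ! qtdemux name=demux
  demux. ! queue ! h264parse ! imxvpudec ! imxipuvideosink
  demux. ! queue ! aacparse ! imxuniaudiodec ! audioconvert ! alsasink'

# On the target board:
echo "gst-launch-1.0 $PIPELINE"
```

If a branch fails to link, gst-launch prints a "delayed linking failed" warning, which at least tells you which pad/caps combination to investigate.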
Thanks for the explanation, perhaps it can help someone fix this. My
guess is that the FSL plugin doesn't handle those dynamic elements and
thus is not equipped to set up rendering in the appropriate window on
the screen.




Also, the full-screen behavior depends on the videosink configuration,
so it's hard to give a universal answer, as none will fit all cases.
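One concrete knob worth checking: the blitter-based gstreamer-imx sinks (imxipuvideosink, imxg2dvideosink) expose window geometry properties that confine rendering to a region of the framebuffer instead of the whole screen. The property names below are from memory and may differ between gstreamer-imx versions, so verify them with gst-inspect-1.0 first:

```shell
# List the sink's actual properties before relying on the names below:
# gst-inspect-1.0 imxipuvideosink

# Hypothetical windowed playback in the top-left 640x480 of the
# framebuffer (property names are an assumption to verify):
SINK='imxipuvideosink window-x-coord=0 window-y-coord=0 window-width=640 window-height=480'
echo "gst-launch-1.0 playbin uri=file:///sintel_trailer-1080p.mp4 video-sink=\"$SINK\""
```

Under Xorg, imxeglvivsink is the sink that renders into an X window rather than writing to the framebuffer directly.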
Regards,
Nikolay
