[meta-freescale] [meta-fsl-arm] Using gstreamer on Nitrogen6x/SabreLite
eric.nelson at boundarydevices.com
Wed Mar 5 09:59:58 PST 2014
On 03/05/2014 10:52 AM, Gary Thomas wrote:
> On 2014-03-05 10:44, Eric Nelson wrote:
>> Hi Gary,
>> On 03/04/2014 08:03 AM, Gary Thomas wrote:
>>> I have a SabreLite with the OV5642 camera. I'd like to capture
>>> some video and display it on the screen. Here's my gstreamer pipeline:
>>> gst-launch -e -vvv mfw_v4lsrc device=/dev/video0 num-buffers=100 typefind=true \
>>> ! "video/x-raw-yuv, format=(fourcc)I420, width=640, height=480, framerate=(fraction)30/1" \
>>> ! ffmpegcolorspace \
>>> ! ximagesink
>>> What I don't understand is why the format from mfw_v4lsrc has to
>>> be I420 when the OV5642 [kernel] driver seems to support only YUYV.
>>> To confuse matters further, I can grab a frame like this:
>>> yavta -fYUYV -s640x480 -F -c1 /dev/video0
>>> and the V4l2 subsystem tells me this sensor is YUYV:
>>> root@nitrogen6x:~# v4l2-ctl -d /dev/video0 --list-formats
>>> ioctl: VIDIOC_ENUM_FMT
>>> Index : 0
>>> Type : Video Capture
>>> Pixel Format: 'YUYV'
>>> Name :
>> This appears to be a bug in the video driver (mxc_v4l2_capture) with
>> respect to enumeration (not a strong point for the drivers).
>> The driver (mxc_v4l2_capture) appears to support a wide variety of
>> pixel formats through the magic of the IPU.
>> The mfw_v4lsrc plugin appears to be hard-coded to support only
>> UYVY and I420 on i.MX6 though (and NV12 on i.MX51).
>> Note that your pipeline is very expensive, and would benefit from
>> using a sink that can support YUV natively (mfw_isink, mfw_v4lsink)
>> or that can do the conversion in hardware (glimagesink).
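A pipeline along those lines might look like the following (untested sketch: mfw_isink comes from the Freescale gst-fsl-plugins package, and exact element/property availability varies between BSP releases):

```shell
# Sketch: capture I420 from the camera and hand it straight to the
# hardware-accelerated sink, avoiding ffmpegcolorspace entirely.
# Untested; assumes the Freescale 0.10 plugins (gst-fsl-plugins).
gst-launch mfw_v4lsrc device=/dev/video0 \
    ! "video/x-raw-yuv, format=(fourcc)I420, width=640, height=480, framerate=(fraction)30/1" \
    ! mfw_isink
```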
> Thanks. I only used ximagesink as an example (one that also
> works on the desktop) for others to see. In the end, the video
> will probably be packaged into some container format (.mp4) and/or
> streamed, so those other methods don't help much.
> Is it possible to use the v4l2src gstreamer element with this video?
> I haven't been able to make that work at all...
Not without some hacking of the kernel driver(s), and after that,
you'd need to figure out how to handle allocation of DMA'able
buffers, and so on.
If you were to undertake that, I'd recommend taking a hard look
at Carlos's GStreamer-1.0 work in the process, since I understand that
the buffer-chaining process is different (and cleaner) in 1.0.
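For the .mp4 case mentioned above, a hardware-encoded recording pipeline on the 0.10 plugins would look roughly like this (untested sketch: vpuenc is the Freescale VPU encoder element, and its codec selection differs between plugin releases, so check gst-inspect vpuenc on your image before relying on defaults):

```shell
# Sketch: encode camera frames with the i.MX6 VPU and mux into .mp4.
# Untested; vpuenc is from gst-fsl-plugins. The -e flag sends EOS on
# Ctrl-C so qtmux can finalize the file's moov atom.
gst-launch -e mfw_v4lsrc device=/dev/video0 num-buffers=300 \
    ! "video/x-raw-yuv, format=(fourcc)I420, width=640, height=480, framerate=(fraction)30/1" \
    ! vpuenc \
    ! qtmux \
    ! filesink location=/tmp/capture.mp4
```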