Explorer
Registered: ‎07-06-2016

Gstreamer and sending data from PS to PL

Hi,

I've currently got a project working that uses GStreamer for video encoding with the VCU. It takes video frames from a USB camera, encodes them, and then saves them to a file.

The camera interface is handled on the PS side and the frames are pushed into the GStreamer pipeline.
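For reference, a pipeline of that shape can be sketched as a gst-launch command. The element names and caps below are assumptions (omxh264enc as the VCU H.264 encoder, 1080p NV12 input), not values taken from the post:

```shell
# Hypothetical sketch of the described appsrc -> VCU -> file pipeline.
# omxh264enc, the caps, and the resolution are assumed placeholders.
PIPELINE='appsrc caps=video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1
  ! omxh264enc ! h264parse ! mp4mux ! filesink location=out.mp4'
# On the target board this would be launched as:
#   gst-launch-1.0 $PIPELINE
echo "$PIPELINE"
```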

Now I need to do some frame processing on the PL side before pushing the frames into the pipeline. Is there a GStreamer element that allows this?

What would be the best and most efficient way to do this?

Any ideas/help would be appreciated.

Thanks in advance

4 Replies
Moderator
Registered: ‎11-09-2015

Re: Gstreamer and sending data from PS to PL

Hi @joseer 

It depends on what you are trying to do. If your processing is only scaling using the Video Processing Subsystem, then yes, you will be able to use GStreamer, because the VPSS has a V4L2 driver which can be called through GStreamer.

But if you are using a custom IP, it will only work if you have the proper drivers.

Regards


Florent
Product Application Engineer - Xilinx Technical Support EMEA
**~ Don't forget to reply, give kudos, and accept as solution.~**
Explorer
Registered: ‎07-06-2016

Re: Gstreamer and sending data from PS to PL

Hi @florentw ,

Thanks for your answer,

What I'm trying to do is convert raw Bayer frames to RGB using the Sensor Demosaic IP, then convert them to NV12, and finally feed them to the VCU for encoding.

The data (frames) is received through USB and buffered to memory in the PS domain. I'm able to push the frames into a GStreamer pipeline using the appsrc element, then encode and save them to a file, but obviously the resulting video's color format is wrong, so I need to do a Bayer interpolation and conversion to get the correct NV12 format supported by the VCU.

I could use the PS and the videoconvert element to process and prepare the frames, but that is quite processing-intensive for the CPU, so I need to use the PL to keep the frame rate as high as possible.

So in summary, what I'm trying to do is the following (the PL conversion is the step I need to implement):

USB frames --> memory (PS) --> push to GStreamer pipeline (PS) --> Bayer demosaic & NV12 conversion (PL) --> back to GStreamer pipeline (PS)

Or:

USB frames --> memory (PS) --> Bayer demosaic & NV12 conversion (PL) --> memory (PS) --> push to GStreamer pipeline (PS)

I saw the mem2mem framework as a potential solution, but unfortunately it doesn't look like it supports Y8/GRAY8 or a similar format needed to send the raw data to the PL...

Do you know what would be the best way or ways to do this?

Moderator
Registered: ‎11-09-2015

Re: Gstreamer and sending data from PS to PL

Hi @joseer 

I am not sure if it would work, but what if you read the data as YUYV8? You would get the data as if it were two pixels per clock (just assume the chroma sample is actually a Y), and then you could use an AXI4-Stream data width converter to move back to one pixel per clock.
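The size bookkeeping behind that relabelling can be checked quickly: a W x H GRAY8 frame (1 byte per pixel) occupies exactly the same bytes per line as a (W/2) x H YUYV frame (2 bytes per pixel), so the buffer contents need no repacking. A quick sanity check (the resolution is an arbitrary example, not from the thread):

```shell
# A W x H GRAY8 frame (1 byte/pixel) has the same size and stride as a
# (W/2) x H YUYV frame (2 bytes/pixel), so the raw buffer can simply be
# relabelled. 1920x1080 is an arbitrary example resolution.
W=1920; H=1080
GRAY8_BYTES=$((W * H * 1))        # 1 byte per pixel
YUYV_BYTES=$(((W / 2) * H * 2))   # half the width, 2 bytes per pixel
echo "GRAY8: $GRAY8_BYTES bytes, YUYV: $YUYV_BYTES bytes"
```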

This is just an idea; I don't know if it will integrate well with the framework.

Let me know if you find a solution (or if you try this and it works); I would be interested.

Regards


Florent
Explorer
Registered: ‎07-06-2016

Re: Gstreamer and sending data from PS to PL

Hi @florentw ,

Thanks for the reply and suggestion.

In summary, what I'm seeing are two potential solutions/options. Please feel free to add more alternatives (if there are any), or to correct these ones:

- Frame conversion within the GStreamer pipeline: set up the mem2mem driver framework for passthrough data, but insert an AXI4-Stream data width converter IP along with the Sensor Demosaic IP, so the PL would look like:

frame-read IP (YUYV8) --> AXI4-Stream data width converter --> Sensor Demosaic IP (RGB) --> frame-write IP
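If that PL chain were exposed through the V4L2 mem2mem framework, the GStreamer side might look like the sketch below. v4l2convert is the stock GStreamer element for V4L2 mem2mem devices, but the device path, caps, and encoder element name here are assumptions, not verified against this design:

```shell
# Hypothetical option-1 pipeline: appsrc feeds raw frames, a V4L2 mem2mem
# device (the PL demosaic chain) converts them, then the VCU encodes.
# /dev/video0, the caps, and omxh264enc are assumed, not confirmed.
PIPELINE='appsrc caps=video/x-raw,format=GRAY8,width=1920,height=1080
  ! v4l2convert device=/dev/video0
  ! video/x-raw,format=NV12
  ! omxh264enc ! h264parse ! mp4mux ! filesink location=out.mp4'
# On the board: gst-launch-1.0 $PIPELINE
echo "$PIPELINE"
```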

- Frame conversion before pushing into the GStreamer pipeline: set up a video pipeline (the same as above) and use the V4L2 driver to send the raw frames memory-to-memory (PS -> PL -> PS).

The first one seems the easiest, but I'm not sure if it would work. I'll try it and post the results...