Visitor · Registered: 09-26-2018

Real-time video out from IR sensor

Hey everyone, this is my first post, so please forgive me if I mess something up.

I am trying to interface with an IR sensor (640x480) and display the raw data (for now) via HDMI out.

For now I am just simulating on the FPGA (Artix-7) what the actual hardware does: the pixel data is just a grayscale gradient across the width, but I am replicating the psync, hsync and vsync of the actual sensor. PSYNC for the sensor is 19.2 MHz at 60 Hz (HSYNC is 25 psyncs and VSYNC is 689 psyncs).

I'm struggling with displaying it over HDMI at 640x480 @ 60 Hz, where the output pixel clock is 25 MHz.

The main issue is that the sensor's psync runs at 19.2 MHz while the display clock runs at 25 MHz.

Is there any way I can achieve this without significant delay? I don't want to buffer the whole frame, which would add roughly 16 ms (1/60 s) of latency. It's just a video stream in and out.

I have tried buffering several lines of input data in BRAM and starting the output stream once the buffer is full, but the image on the display isn't stable (there is a non-static misalignment from one horizontal line to the next).

I have been trying to get this to work for 3 days now. Any help or ideas would be appreciated.

Please let me know if I'm missing some information here. This is my first time diving into images/video on an FPGA.

 

EDIT 1: The input frame time from the sensor is (665*480 + 664)/19.2M = 16.659 ms

And the display output frame time is 800*525/25M = 16.8 ms

So the input frame is always shorter in this case, which is why the buffer level between the two never stays constant.
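(For anyone checking the numbers, here is the same arithmetic as a small Python sketch; the lines-per-frame drift figure at the end is derived from these timings, not taken from a datasheet.)

```python
# Timings from this thread: 665 psyncs per sensor line (640 active + 25
# hsync), 664 psyncs of vsync, 19.2 MHz psync; 800x525 output at 25 MHz.
PSYNC_HZ, LINE_PSYNCS = 19.2e6, 665
sensor_frame = (LINE_PSYNCS * 480 + 664) / PSYNC_HZ   # ~16.660 ms
hdmi_frame = 800 * 525 / 25e6                         # 16.800 ms

# The sensor finishes each frame ~0.14 ms before the display does, so a
# buffer between them gains roughly 4 sensor lines every frame.
drift = hdmi_frame - sensor_frame
print(f"sensor {sensor_frame*1e3:.3f} ms, hdmi {hdmi_frame*1e3:.3f} ms, "
      f"drift {drift*1e6:.0f} us = {drift*PSYNC_HZ/LINE_PSYNCS:.1f} lines/frame")
```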

This must be a common problem with a standard solution, but this is my first project in imaging and I don't know where to look.

 

8 Replies
Visitor · Registered: 04-15-2019

Just out of curiosity: what FPGA board and thermal sensor are you using?

Teacher · Registered: 06-16-2013

Hi @rituj_b 

 

Why don't you use an up-scaler to convert the stream to a higher pixel clock?

 

Best regards,

Advisor · Registered: 04-26-2015

@rituj_b It looks like you need to buffer a specific part of a frame.

Let's say HDMI lines take 25.6 us (640 / 25 MHz) and presumably IR sensor lines take 35.9 us (689 / 19.2 MHz). Then a whole frame sent to HDMI takes 25.6 us * 480 = 12.3 ms and a whole frame received from the sensor takes 17.2 ms. What you need to buffer is the "missing" 5.1 ms of data, which corresponds to 143 sensor lines. Then the process is:

- Fill the buffer (takes 5.1ms)

- Start your HDMI transmit from the buffer (143 lines is 3.67 ms of output on its own), refilling the buffer from the sensor as you go. The buffer slowly drains anyway because it's being emptied faster than it's being filled. Reading out the full 480 lines takes 12.3 ms, and receiving the remaining 337 sensor lines should also take about 12.3 ms, so by the time the buffer is actually empty you should have reached the end of the frame.

That gives you the minimum real-world latency, since the last line sent is going to essentially go straight from the sensor to the buffer to the HDMI output with only a couple of cycles of delay.
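(A quick Python check of that sizing, using the active-only line times assumed in this reply; the bare subtraction lands a little under the 143-line figure quoted above, which presumably rounds up for margin.)

```python
# Active-only line times assumed above (blanking ignored).
HDMI_LINE = 640 / 25e6        # 25.6 us per HDMI line
SENSOR_LINE = 689 / 19.2e6    # ~35.9 us per sensor line

hdmi_frame = HDMI_LINE * 480      # ~12.3 ms to send a frame
sensor_frame = SENSOR_LINE * 480  # ~17.2 ms to receive a frame

# Start the readout late enough that it finishes exactly when the sensor
# does; everything received before that point must sit in the buffer.
head_start = sensor_frame - hdmi_frame
print(f"buffer {head_start*1e3:.1f} ms = {head_start/SENSOR_LINE:.0f} sensor lines")
```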

Visitor · Registered: 09-26-2018

Hey @u4223374 

Thanks for the idea. However, a sensor line actually takes 665/19.2 us to come in, and a frame is 480 such lines plus 664 psyncs of vsync.

We also have to take the sync signals of the VGA-style HDMI output into account: the total frame sent is 800x525.

So, once a frame has ended, the system should be in exactly the same state it was in at the start of the previous frame (if we start the output when the input has filled 143 lines, for example, we should be back at that 143-line point when the frame ends). But I can't achieve this because the two frame times are not the same (the HDMI frame time is smaller than the sensor's), so with each frame the offset drifts away from 143 lines and the buffer level doesn't hold.
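(To put numbers on that drift, a hypothetical frame-by-frame model using the full timings above; it shows the buffered lead walking away from the 143-line starting point by about four lines per frame, i.e. the headroom in a fixed-size buffer shrinks every frame.)

```python
# Hypothetical model: buffer occupancy (in sensor lines) at the start of
# each successive output frame, given the full-frame timings above.
PSYNC_HZ, LINE_PSYNCS = 19.2e6, 665
sensor_frame = (LINE_PSYNCS * 480 + 664) / PSYNC_HZ   # ~16.660 ms
hdmi_frame = 800 * 525 / 25e6                         # 16.800 ms
sensor_line = LINE_PSYNCS / PSYNC_HZ

occupancy = 143.0  # lines buffered before the first output frame starts
for frame in range(5):
    print(f"frame {frame}: {occupancy:6.1f} lines buffered")
    # the sensor delivers each frame ~0.14 ms faster than the display
    # consumes it, so that surplus accumulates every frame
    occupancy += (hdmi_frame - sensor_frame) / sensor_line
```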

 

Visitor · Registered: 09-26-2018

Hi @watari

Are you referring to the Xilinx Multi-Scaler?

I looked at it, and I'm not sure how to incorporate it into my design even after going through the guide (I've never used an AXI interface). Do you have any GitHub references that use it, so I can get a better idea of how it works?

Explorer · Registered: 07-18-2011

@rituj_b 

The bottom line is this: if your input and output frame rates are different, you either have to add a frame synchronizer using DDR or other frame memory, or find some way to synchronize one frame rate to the other (input to output, or output to input) and use line buffers to absorb the short-term line differences.

Some IR cameras have a frame sync input or a triggered mode of operation, allowing you to trigger the start of the video frame readout from your output vertical sync pulse. As long as you have enough FIFO memory, you can handle the clock/line-rate differences.
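(As a rough sketch of what "enough FIFO memory" means for the triggered case, assuming the output vsync starts the sensor frame and the timings from this thread: only the per-line rate difference has to be absorbed. The margin calculation below is my own, and real sensors add trigger latency on top.)

```python
# Rough FIFO sizing when the output vsync triggers the sensor frame, so
# only the per-line rate difference remains to be absorbed.
SENSOR_LINE = 665 / 19.2e6    # ~34.6 us per sensor line (incl. hsync)
HDMI_LINE = 800 / 25e6        # 32.0 us per HDMI line (incl. blanking)
LINES = 480

# HDMI lines are shorter, so the readout must start late enough that
# line n is always in the FIFO before the display needs it.
delay = max(n * SENSOR_LINE - (n - 1) * HDMI_LINE for n in range(1, LINES + 1))
print(f"start delay {delay*1e3:.2f} ms -> FIFO of ~{delay/SENSOR_LINE:.0f} lines")
```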

If the camera can accept an external clock, you may be able to run it at the output clock rate, or at some rate locked to the output clock, to assist in matching the input and output frame rates.  

If the input and output clock/line/frame rates are fixed and cannot be changed, you will have to add some frame memory and synchronize, using a MIG and VDMA or Frame Buffer Write/Read IP.

 

Teacher · Registered: 06-16-2013

Hi @rituj_b 

 

Do you use a native video interface?

If yes, would you use a scaler IP together with "Video In to AXI4-Stream" and "AXI4-Stream to Video Out"?

 

Best regards,
