
Designing timing constraints for serial ADC, LVDS SDR

I am working on a project that uses a Zynq-7000 SoC to interface with a 16-bit LVDS ADC (AD7961), and I'm trying to make sure that all of the ADC's timing requirements are met.

The ADC requires two LVDS inputs, CLK± and CNV±, and provides two LVDS outputs, DCO± and D±. CLK± is a gated clock, created by feeding a 125 MHz clock (generated inside the Zynq) through an ODDR and then an OBUFDS. After 16 bursts of CLK± I have to hold it low until the next sample is available. An echoed data clock (DCO±) is then sent back to the interface, and D± changes on every negative edge of that clock. In my design, I use every positive edge of the data clock to capture the data, which is put through a shift register until all 16 bits of the sample are acquired. Those 16 bits then go through an independent-clock FIFO generator IP to synchronize the sample data back to the clock driving the interface, so that I can begin transferring the data to the Zynq using a DMA elsewhere in the FPGA.
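Because CLK± is forwarded through the ODDR, I believe the forwarded clock also needs to be described to the tools with a create_generated_clock on the output port. A minimal sketch of what I have in mind, where ODDR_inst and CLK_p are placeholder names for the ODDR instance and the output port, not my actual netlist names:

# Forwarded ADC clock: same 8 ns period as the internal 125 MHz clock,
# launched from the ODDR clock pin and leaving the FPGA on CLK_p.
create_generated_clock -name adc_clk_fwd -source [get_pins ODDR_inst/C] -divide_by 1 [get_ports CLK_p]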

Here is a picture of my interface:

[Image: firefox_2019-04-13_23-33-45.png]

And here is a picture from the ADC data sheet showing an example of sample acquisition:

[Image: firefox_2019-04-13_23-36-58.png]

 

The data sheet gives a worst-case delay of 5 ns from CLK± to DCO±, and a worst-case delay of 1 ns from DCO± to D±. I have read other threads on this forum regarding timing constraints with ADCs, and from what I understand, if the data clock is used to capture the incoming data, then the relationship between the clock sent to the ADC and the data clock is mesochronous: the two clocks have the same frequency, but the skew between them is unknown. In that case, I have read that one way of handling this situation is to use a create_clock constraint to define the data clock, and then use set_input_delay to describe the known delay between the data clock and the incoming data. So, something like:

create_clock -name data_clk -period 8.0 [get_ports DCO_Clk]
set_input_delay -clock data_clk -max 1.0 [get_ports d_single]
set_input_delay -clock data_clk -min -5.0 [get_ports d_single]

(My thinking is that, in the worst case, the data becomes valid 1 ns after the clock edge and remains valid until 5 ns after that edge.)
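Alternatively, since D± actually changes on the negative edge of DCO±, maybe the input delays should be referenced to the falling edge instead. A sketch of that variant, where the 0 ns minimum DCO±-to-D± delay is an assumption on my part (the data sheet value would go there):

# D± launches on the falling edge of DCO±, so reference the delays to it.
set_input_delay -clock data_clk -clock_fall -max 1.0 [get_ports d_single]
set_input_delay -clock data_clk -clock_fall -min 0.0 [get_ports d_single]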

 

Is this an appropriate way to constrain this interface?

 
