Designing timing constraints for serial ADC, LVDS SDR
I am working on a project that uses a Zynq-7000 SoC to interface with a 16-bit LVDS ADC (AD7961), and I'm trying to make sure that all of the ADC's timing requirements are met.
The ADC requires 2 LVDS inputs, CLK± and CNV±, and provides 2 LVDS outputs, DCO± and D±. CLK± is a gated clock, created by connecting a 125 MHz clock (generated inside the Zynq chip) to an ODDR and then to an OBUFDS. After a burst of 16 CLK± pulses I have to hold it low until the next sample is available. An echoed data clock (DCO±) is then sent back to the interface, and D± changes on every falling edge of that clock. In my design, I capture the data on every rising edge of DCO± and shift it through a shift register until all 16 bits of the sample are acquired. Those 16 bits are then put through an independent-clock FIFO generator IP to synchronize the sample data back to the clock driving the interface, so that I can begin transferring the data to the Zynq using a DMA elsewhere in the FPGA.
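For reference, the ODDR→OBUFDS clock-forwarding step described above can be sketched roughly as below. This is only an illustration of the structure, not my exact code; the module and signal names (`clk_125`, `clk_en`, `adc_clk_p/n`) are placeholders I've made up:

```verilog
// Sketch of gated-clock forwarding on 7-series, assuming a burst-enable
// signal `clk_en` that is high for the 16-cycle burst and low otherwise.
module clk_forward (
    input  wire clk_125,   // 125 MHz clock from the Zynq
    input  wire clk_en,    // high during the 16-pulse burst
    output wire adc_clk_p, // CLK+ to the AD7961
    output wire adc_clk_n  // CLK- to the AD7961
);
    wire clk_fwd;

    // Forward the clock through the output DDR flip-flop; gating via D1
    // keeps the enable synchronous to clk_125 (no glitches on the pad).
    ODDR #(
        .DDR_CLK_EDGE ("SAME_EDGE"),
        .INIT         (1'b0),
        .SRTYPE       ("SYNC")
    ) oddr_clk_i (
        .Q  (clk_fwd),
        .C  (clk_125),
        .CE (1'b1),
        .D1 (clk_en),  // value driven after the rising edge
        .D2 (1'b0),    // value driven after the falling edge
        .R  (1'b0),
        .S  (1'b0)
    );

    // Differential output buffer for the LVDS pair
    OBUFDS obufds_clk_i (
        .I  (clk_fwd),
        .O  (adc_clk_p),
        .OB (adc_clk_n)
    );
endmodule
```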
Here is a picture of my interface
And here is a picture from the ADC data sheet showing an example of sample acquisition
The data sheet specifies a worst-case delay of 5 ns from CLK± to DCO±, and a worst-case delay of 1 ns from DCO± to D±. I have read other threads on this forum regarding timing constraints with ADCs, and from what I understand, if the data clock is used to capture the incoming data, then the relationship between the clock sent to the ADC and the data clock is mesochronous: the two clocks have the same frequency, but the skew between them is unknown. In that case, I have read that one way of handling this situation is to use a create_clock constraint to define the data clock, and then constrain the known delay between the data clock and the incoming data. So something like
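Something along these lines in the XDC, where the port names (`dco_p`, `adc_data_p`) are placeholders for whatever the actual top-level ports are called:

```tcl
# DCO± is treated as a new 125 MHz clock defined on its input port,
# independent of the forwarded CLK± (same period, unknown phase).
create_clock -name dco_clk -period 8.000 [get_ports dco_p]

# D± changes at most 1 ns after the DCO edge that launches it.
# Data is launched on the falling edge (-clock_fall) and captured
# in the FPGA on the rising edge. The -min value is a placeholder;
# it should come from the data sheet's minimum DCO-to-D delay.
set_input_delay -clock dco_clk -clock_fall -max 1.000 [get_ports adc_data_p]
set_input_delay -clock dco_clk -clock_fall -min 0.000 [get_ports adc_data_p]
```

With data launched on the falling edge and captured on the rising edge, the tools see roughly half a DCO period of margin, minus the 1 ns worst-case DCO-to-D delay.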