jhandrews

Timing constraints for SPI slave interface on FPGA

I have an ADC acting as SPI master, sourcing the clock, data, and sync signals, with the clock running at 40 MHz (25 ns period).

I'm using the positive edge of the SPI clock to sample the data and sync signals, so I don't think I have to worry about the possible variation in duty cycle shown in the attached image.  SpiClk comes into dedicated global clock pins.

Per the attached diagram from the ADC spec, the sync signal transitions as little as 3 ns before the SpiClk edge, and the data as little as 4 ns before it.

All three signals are inputs to the FPGA, so my question is: what should the timing constraints look like?

It seems like the sync was sourced by the previous SpiClk rising edge and then delayed by 22 ns (the 25 ns period minus the 3 ns lead). The same goes for SpiData, only its input delay would be 21 ns (25 ns minus 4 ns).

Do the constraints below properly constrain this?

I'd also need a set_input_delay -min. What should this be set to?

I don't really understand where the 0-phase version of the clock is defined by the create_clock statement, since this is an externally applied clock.

Nor do I understand how the timing analyzer handles this, since what really matters is the difference in insertion delay of SpiData/SpiSync vs. SpiClk.
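
To see what the tool is actually doing with these paths once the constraints below are in, I was planning to run something like the following (assuming Vivado; the options are just my guess at what's useful for looking at both setup and hold on these inputs):

# Report setup (max) and hold (min) analysis on the two data inputs
report_timing -from [get_ports {SpiData SpiSync}] -delay_type min_max -max_paths 2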

------

create_clock -period 25.000 -name SpiClk -waveform {0.000 12.500} [get_ports {A2D_Gck_0[0]}]

set_input_delay -clock [get_clocks SpiClk] -max 22 [get_ports SpiSync]

set_input_delay -clock [get_clocks SpiClk] -max 22 [get_ports SpiData]
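
For the -min side, my rough sketch would be something like the lines below; the 2 ns value is just a placeholder for the ADC's minimum clock-to-output delay, which isn't called out in the diagram:

# Placeholder hold-side constraints; 2 ns stands in for the ADC's minimum
# clock-to-output delay, which I don't have from the spec
set_input_delay -clock [get_clocks SpiClk] -min 2 [get_ports SpiSync]
set_input_delay -clock [get_clocks SpiClk] -min 2 [get_ports SpiData]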

Thanks for your help!

1 Reply
jhandrews

Attached the timing diagram again, since it didn't appear initially: timing.png
