Timing constraints for an SPI slave interface on an FPGA.
I have an ADC acting as the SPI master, sourcing clock, data, and sync, with the clock running at 40 MHz (25 ns period).
I'm using the positive edge of the SPI clock to sample the data and sync signals, so I don't think I have to worry about the possible variation in duty cycle shown in the attached image. SpiClk comes in on a dedicated global clock pin.
Per the attached diagram from the ADC spec, the sync signal transitions as little as 3 ns before the SpiClk edge, and the data as little as 4 ns before it.
All three signals are inputs to the FPGA. So my question is: what should the timing constraints look like?
It seems like sync can be modeled as if it were launched by the previous SpiClk rising edge and then delayed by up to 22 ns (the 25 ns period minus the 3 ns setup); same for SpiData, except its input delay would be 21 ns (25 ns minus 4 ns).
Do the below constraints properly constrain this?
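Something like the following is what I have in mind, using XDC/SDC syntax (the port names SpiClk, SpiSync, and SpiData are placeholders for my actual top-level pins; the delay values come from the period-minus-setup arithmetic above):

```tcl
# 40 MHz SPI clock sourced externally by the ADC, arriving on a
# dedicated global clock pin
create_clock -name SpiClk -period 25.000 [get_ports SpiClk]

# Sync transitions as little as 3 ns before the capturing SpiClk edge.
# Modeled as launched by the previous rising edge, the worst-case
# arrival is 25 - 3 = 22 ns after that launch edge.
set_input_delay -clock SpiClk -max 22.000 [get_ports SpiSync]

# Data transitions as little as 4 ns before the edge: 25 - 4 = 21 ns.
set_input_delay -clock SpiClk -max 21.000 [get_ports SpiData]
```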
I'd also need a set_input_delay -min. What should that be set to?
I also don't really understand where the zero-phase version of the clock defined by the create_clock statement exists, since this is an externally applied clock.
Nor how the timing analyzer handles this, since what really matters is the difference in insertion delay of SpiData/Sync versus SpiClk.