729 Views
Registered: ‎09-16-2019

Calculation of output delay constraint min and max

Jump to solution

Hello, 

I have a DAC that is interfaced with an FPGA. 

We are running the DAC at 200 MHz. 

The timing specifications for the DAC (we are using Dual-Bus mode):

[Attachment: DACtiming.JPG (DAC timing specifications)]

I believe I need to use the constraint:

create_clock -period 5.000 [get_ports "clock name"]

set_output_delay -clock [get_clocks "clock name"] -max <max value> [get_ports {"data name"}]

set_output_delay -clock [get_clocks "clock name"] -min <min value> [get_ports {"data name"}]

 

Is this correct, and how do I calculate the max and min values properly using the datasheet and the trace delay times from the PCB?

 

Thank you. 

0 Kudos
1 Solution

5 Replies
711 Views
Registered: ‎09-16-2019

I used this formula from another thread:

  • t_pxd = circuit board trace delay for the data
  • t_pxc = circuit board trace delay for the clock
  • t_sux = setup time for the external register that receives the clock and data
  • t_hdx = hold time for the external register that receives the clock and data
set_output_delay -clock FCLK -max t_max [get_ports data_out]
set_output_delay -clock FCLK -min t_min [get_ports data_out]

where

t_max = max(t_pxd - t_pxc) + t_sux
t_min = min(t_pxd - t_pxc) - t_hdx

For my device, t_sux = 1 ns and t_hdx = 1 ns.
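For illustration only, here is how those numbers might plug in. The trace delays below are hypothetical placeholders (0.40-0.60 ns for the data traces, 0.50 ns for the clock trace), and data_out* is just the port-name placeholder used above:

# Hypothetical board numbers (replace with values from your PCB layout):
#   t_pxd = 0.40 ns to 0.60 ns (data traces),  t_pxc = 0.50 ns (clock trace)
#   t_max = max(t_pxd - t_pxc) + t_sux = (0.60 - 0.50) + 1.0 =  1.10 ns
#   t_min = min(t_pxd - t_pxc) - t_hdx = (0.40 - 0.50) - 1.0 = -1.10 ns
set_output_delay -clock FCLK -max  1.100 [get_ports data_out*]
set_output_delay -clock FCLK -min -1.100 [get_ports data_out*]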

0 Kudos
pedro_uno
Advisor
693 Views
Registered: ‎02-12-2013

Is this for a source synchronous clock scheme?

How do you define FCLK?

----------------------------------------
DSP in hardware and software
-----------------------------------------
0 Kudos
690 Views
Registered: ‎09-16-2019

The DAC is provided a 200 MHz clock from the FPGA, so "FCLK" has its period set to 5 ns. My understanding is that, since the FPGA provides the clock to the DAC, this is a system-synchronous clock scheme.

0 Kudos
629 Views
Registered: ‎01-22-2015

Hi Again David,

If the source forwards both data and clock to the receiver, then it is called a source-synchronous interface.  If the clock comes from outside (e.g., an oscillator on the board) and goes to both source and receiver, then it is called a system-synchronous interface (which is not used much anymore and is inferior to the source-synchronous interface).

The interface between your FPGA and the DAC is a source-synchronous, Single Data Rate (SDR) output.  The interface consists of:

  • data-clock:  Let's say that this clock is sent out of your FPGA on the pin called clk_out 
  • data:  Let's say that these data are sent out of the FPGA on pins called data_out[n], n=11 downto 0

The set_output_delay constraints you have shown are correct.  The formulas for t_max and t_min are also correct.   

 

Your 200 MHz clock should be used to clock digital registers that have been LOC-ed into the IOBs, and these registers then forward the data to the pins data_out[n].
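One way to request this in Vivado (a sketch, assuming the placeholder port names above) is the IOB property:

# Ask Vivado to pack the registers driving these outputs into the IOBs
# (data_out[*] is the placeholder bus name from above; adjust to your design)
set_property IOB TRUE [get_ports {data_out[*]}]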

 

You need a constraint to define the data-clock, which the set_output_delay constraints have called FCLK.  The format of this constraint depends on the circuit in the FPGA that you are using to send out the data-clock.  Normally, we use the ODDR circuit shown below.

[Attachment: ODDR3.jpg (ODDR clock-forwarding circuit)]

In this circuit, your 200MHz clock is sent to the C-pin of the ODDR and the ODDR forwards the clock to FPGA pin, clk_out.  If you are using this circuit, then the constraint for defining FCLK is shown below.

create_generated_clock -name FCLK -source [get_pins ODDR1/C] -divide_by 1 [get_ports clk_out]

However, the above circuit tends to send the clock and data edge-aligned, which may cause the interface to fail timing analysis.

 

If the above circuit and constraint cause the interface to fail timing analysis, then reconfigure the ODDR so that input D1 is tied low (0) and D2 is tied high (1).  The ODDR will then invert the data-clock, and the clock and data will be sent roughly center-aligned.  If you do this, then the constraint for FCLK becomes:

create_generated_clock -name FCLK -source [get_pins ODDR1/C] -divide_by 1 -invert [get_ports clk_out]
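Putting the pieces together, a minimal sketch of the full constraint set for this interface might look like the following. The port names (clk_in, clk_out, data_out*), the cell name ODDR1, and the 1.10/-1.10 ns numbers are only the placeholders and example values used earlier in this thread; substitute the names and numbers from your own design and board:

# 200 MHz clock entering the FPGA (or constrain it wherever it originates in your design)
create_clock -period 5.000 -name sys_clk [get_ports clk_in]

# Forwarded data-clock, generated on clk_out by the ODDR (inverted variant shown)
create_generated_clock -name FCLK -source [get_pins ODDR1/C] -divide_by 1 -invert [get_ports clk_out]

# Output-delay constraints referenced to FCLK, using the example t_max/t_min from above
set_output_delay -clock FCLK -max  1.100 [get_ports data_out*]
set_output_delay -clock FCLK -min -1.100 [get_ports data_out*]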

 

If neither of the above methods allows the interface to pass timing analysis, then you will need some means of delaying either the clock or the data.  Let me know if you need help with doing this.

Cheers,
Mark


576 Views
Registered: ‎09-16-2019

Thank you for your help. I will do a build with this information and let you know if I have any issues. 

0 Kudos