squaringcircle
Adventurer

Are input delay constraints required if IDELAY is used?


Hello,

I'm working on a DDR interface that uses IDELAYE2 components on all data pins. Everything is working OK so far with an auto-calibration scheme. However, I do not have any constraints on these inputs other than the incoming data clock itself (i.e. no input delay relative to that clock, etc.).

Is this OK, or is there a better way of constraining this design?

Thanks for any feedback or opinion.


5 Replies
hemangd
Moderator (Accepted Solution)

Hi @squaringcircle ,

It is recommended to constrain the I/O ports of the FPGA; this is where input and output delay constraints are required.

By applying an input delay constraint, the tool knows that the port is part of a timing path, and that path is then analyzed against the setup and hold values of the upstream device.
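For illustration only (the port name "ddr_data", the clock name "ddr_clk", and the delay values below are placeholders, not taken from this thread), input delay constraints for a DDR input would typically look something like this in XDC:

# The -max/-min values come from the upstream device's data sheet plus board trace delays.
set_input_delay -clock [get_clocks ddr_clk] -max 1.200 [get_ports {ddr_data[*]}]
set_input_delay -clock [get_clocks ddr_clk] -min 0.400 [get_ports {ddr_data[*]}]
# For a DDR interface, also constrain the transfers launched on the falling clock edge:
set_input_delay -clock [get_clocks ddr_clk] -clock_fall -max 1.200 [get_ports {ddr_data[*]}] -add_delay
set_input_delay -clock [get_clocks ddr_clk] -clock_fall -min 0.400 [get_ports {ddr_data[*]}] -add_delay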

Regards,
hemangd

Don't forget to give kudos and mark it as accepted solution if your issue gets resolved.

viviany
Xilinx Employee

Since you mentioned auto calibration: if the IDELAYE2 is working in VARIABLE mode, then you don't need an input delay constraint for the interface.

If the IDELAYE2 is working in FIXED mode, an input delay constraint is needed to check whether timing on the interface is met.

-vivian

Don't forget to reply, kudo, and accept as solution.
squaringcircle
Adventurer

Hello @viviany ,

 

Thank you for clarifying.

You're right: we're using the variable load mode for the IDELAY and calibrate to a known data pattern.

avrumw
Guide

"You're right: we're using the variable load mode for the IDELAY and calibrate to a known data pattern."

Even in this case you probably want some constraints (even though you cannot do "static" timing analysis on a dynamically calibrated interface).

Without a constraint, the tools will flag the pins with the "no_input_delay" warning reported by check_timing. If you want to make these go away, the "normal" way is to put a set_input_delay (with any value) on the pins to keep check_timing happy, and also put a set_false_path -from [get_ports <the ports>] on them to disable the timing check. The false path also makes it clear that this is intentional: just having no set_input_delay could be an oversight, but the set_false_path makes the intent explicit.
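As an illustration only (the port name "ddr_data" and clock name "ddr_clk" below are placeholders, not taken from this design), that pair of constraints might look like:

# Dummy input delay, only there to keep check_timing happy; the value is arbitrary.
set_input_delay -clock [get_clocks ddr_clk] 0.000 [get_ports {ddr_data[*]}]
# Exclude the dynamically calibrated paths from static analysis and make the intent explicit.
set_false_path -from [get_ports {ddr_data[*]}]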

Avrum

squaringcircle
Adventurer

Hi @avrumw 

 

Thank you very much for clarifying.

It makes sense to have a clean definition in order to avoid any warnings or issues in the timing checks.
