Adventurer | Registered: 07-08-2016

Timing failure: help with constraints and clock buffers

Hi Xilinx community, 

 

I am working on a simple design involving two gmii_to_rgmii_converter IP blocks and two AXI4-Stream Data FIFOs. The GMII/RGMII blocks are connected to two Ethernet ports on the Ethernet FMC card, and I am trying to map the GMII outputs of the RGMII/GMII blocks to AXI4-Stream interfaces so I can access them from HLS later on. I use the FIFOs to cross the clock domains appropriately.

 

My problem is that every time I add an ILA debug core to verify some aspect of the design, it fails timing. I can't seem to come up with a design that doesn't fail in the presence of a debug core, and I am starting to think that I am doing something wrong with my constraints, as the design is very simple. As far as I know, all clocks are being placed on clock buffers and the constraints should be OK, so I don't understand why the design is failing. I am a rookie when it comes to reading timing reports, so any help would be greatly appreciated. The project can be found at the link below, and I have attached a copy of the implementation timing report.
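For what it's worth, these are the sanity checks I've been running from the Vivado Tcl console on the implemented design (I may well be misreading the output):

open_run impl_1
# Flags unconstrained paths, missing input/output delays, unclocked registers, etc.
check_timing
# Shows how each clock reaches the clock buffers and the fabric
report_clock_networks
# Shows which clock pairs interact and whether those crossings are constrained
report_clock_interaction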

 

I have read that some people experience the same issue with ILA, and have solved it by "moving all ILA cores to their own clock region". How can I do this? I can't find anything online that explains how to manipulate clock regions. 
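The closest mechanism I've found so far is floorplanning with a pblock tied to a clock region, something like the XDC below, but the ILA instance name here is just a guess on my part and I don't know whether this is the right way to go about it:

# Keep the ILA core in its own clock region (instance name and region are placeholders)
create_pblock pblock_ila
resize_pblock [get_pblocks pblock_ila] -add {CLOCKREGION_X0Y0:CLOCKREGION_X0Y0}
add_cells_to_pblock [get_pblocks pblock_ila] [get_cells -hierarchical -filter {NAME =~ "*ila_0*"}]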

 

Timing Report is attached.  

Project .zip can be found here: https://www.dropbox.com/s/p2jy85roy9nav3o/zedboard_network_tap_FAIL_TIMING.xpr.zip?dl=0

 

Thanks

Guide | Registered: 01-23-2009

These failing paths are not "right"...

 

From the report, it looks as if the tool is treating the BUFR as the start of a timing path, which it isn't - the BUFR should be a propagation element on the source clock path, with its delay counted as part of the source clock insertion delay.
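For reference, in a correctly constrained design the RX clock is defined on the top-level port, and the BUFR simply sits on the clock network between that port and the capture flops - something like the line below (the port name is only an example; 8 ns corresponds to the 125 MHz RGMII receive clock):

# Clock defined at the input port - the BUFR then contributes to the source
# clock insertion delay rather than starting a timing path.
create_clock -period 8.000 -name rgmii_rxc [get_ports rgmii_rxc]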

 

I don't know why it is doing this - it is most likely due to something wrong in a constraint file... (like a set_input_delay or set_max_delay on the BUFR pins themselves...)
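The kind of thing to look for is a constraint that targets the buffer's pins directly instead of a port - patterns like these (the pin and clock names here are made up purely for illustration):

# Constraints of this shape, aimed at the BUFR pins themselves, should NOT exist:
set_max_delay 5.000 -from [get_pins some_hier/rgmii_rx_clk_bufr/O]
set_input_delay -clock some_clk 2.000 [get_pins some_hier/rgmii_rx_clk_bufr/I]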

 

I do have to point out that it is "odd" to have a BUFR feed a BUFG; this is not a recommended clock connection, and will have a large and unpredictable clock insertion delay - but that isn't what is causing this failure...

 

Look through all the .xdc files associated with the project - both your user .xdc files as well as all the .xdc files associated with the IP (the RGMII/GMII converter as well as the ILA).
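From the Tcl console you can also get the complete list of constraint files the tools actually use (including the ones delivered with the IP), and dump the merged constraints as they were applied, for example:

# Every .xdc in the project, including IP and OOC constraint files
get_files -all *.xdc
# Write out all constraints as Vivado applied them (needs an open synthesized/implemented design)
write_xdc -force applied_constraints.xdc
# List the timing exceptions currently in effect (available in recent Vivado versions)
report_exceptions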

 

Avrum

Adventurer | Registered: 07-08-2016

@avrumw Thanks so much for the response! Apologies for the delay - I've been out of town, but I'm back in the office now.

 

I am an experienced embedded systems programmer, but I am somewhat new to advanced FPGA design, so I have a few clarifying questions:

 

"I do have to point out that it is "odd" to have a BUFR feed a BUFG; this is not a recommended clock connection, and will have large and unpredictable clock insertion - but that isn't what is causing this failure..." 

  1. Could you please explain why this is not recommended?
  2. The BUFR --> BUFG combination was inferred during synthesis of my block design. Could you please explain why the toolchain inferred this, and how I might avoid a similar situation in the future?

 

"Look through all the .xdc files associated with the project - both your user .xdc files as well as all the .xdc files associated with the IP (the RGMII/GMII converter as well as the ILA)."

  1. What exactly should I be looking for in these files that would make the BUFR a source element rather than a propagation element? The set_input_delay etc. commands all seem to make sense to me (which doesn't mean they are correct, however... I'm still learning!)
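In the meantime, this is what I've been poking at from the Tcl console, although I'm not sure I'm interpreting the output correctly - my (possibly wrong) understanding is that every clock listed should have a top-level port or an MMCM/PLL output as its source, not a BUFR pin:

# Each clock definition and where it was defined (the Sources column)
report_clocks
# Locate the BUFR instances so I can check whether any constraint references their pins
get_cells -hierarchical -filter {REF_NAME == BUFR}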

I've attached the .xdc files that I think are relevant, as well as the OOC versions (just in case). Any insight you could provide would be greatly appreciated.

 

Thanks again!
