08-04-2016 01:41 PM
Hi Xilinx community,
I am working on a simple design involving two gmii_to_rgmii_converter IP blocks and two AXI4-Stream Data FIFOs. The GMII/RGMII blocks are connected to two Ethernet ports on the Ethernet FMC card, and I am trying to map the GMII outputs of the RGMII/GMII blocks to AXI-Stream interfaces so I can access them from HLS later on. I use the FIFOs to cross clock domains.
My problem is that every time I add an ILA debug core to verify some aspect of the design, it fails timing. I can't seem to come up with a design that passes in the presence of a debug core, and I am starting to think I am doing something wrong with my constraints, as the design is very simple. AFAIK all clocks are placed on clock buffers and the constraints should be fine, so I don't understand why the design is failing. I am a rookie when it comes to reading timing reports, so any help would be greatly appreciated. The project can be found at the link below, and I have attached a copy of the implementation timing report.
I have read that some people experience the same issue with ILA, and have solved it by "moving all ILA cores to their own clock region". How can I do this? I can't find anything online that explains how to manipulate clock regions.
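For reference, "moving an ILA core to its own clock region" means floorplanning it into a pblock that is constrained to a single clock region. A minimal XDC sketch of the idea - the ILA instance name u_ila_0 and the region CLOCKREGION_X0Y1 are placeholders for illustration, not taken from this project:

```tcl
# Create a pblock, restrict it to one clock region, and assign the ILA cell to it.
# Find the actual ILA instance name in your netlist first, e.g. with:
#   get_cells -hier -filter {ORIG_REF_NAME =~ "ila_*"}
create_pblock pb_ila_0
resize_pblock [get_pblocks pb_ila_0] -add {CLOCKREGION_X0Y1}
add_cells_to_pblock [get_pblocks pb_ila_0] [get_cells -hier u_ila_0]
```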
Timing Report is attached.
Project .zip can be found here: https://www.dropbox.com/s/p2jy85roy9nav3o/zedboard_network_tap_FAIL_TIMING.xpr.zip?dl=0
08-04-2016 02:12 PM
These failing paths are not "right"...
From the report, it looks as if it is considering the BUFR as the start of a timing path, which it isn't - the BUFR should be considered as a propagation element on the source clock delay.
I don't know why it is doing this - it is most likely due to something wrong in a constraint file... (like a set_input_delay or set_max_delay on the BUFR pins themselves...)
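To illustrate the kind of constraint that can cause this (the pin name below is hypothetical, not from this project): a delay exception whose -from argument targets a buffer's output pin makes the timer treat that pin as a path startpoint, which matches what the report is showing.

```tcl
# A constraint like this would make the BUFR output look like the start of a timing path:
set_max_delay 5.0 -from [get_pins some_hier/my_bufr_inst/O]
```

Running report_exceptions on the implemented design will list all such exceptions and show which objects they apply to.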
I do have to point out that it is "odd" to have a BUFR feed a BUFG; this is not a recommended clock connection, and will have a large and unpredictable clock insertion delay - but that isn't what is causing this failure...
Look through all the .xdc files associated with the project - both your user .xdc files as well as all the .xdc files associated with the IP (the RGMII/GMII converter as well as the ILA).
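A few standard Vivado Tcl commands (run with the synthesized or implemented design open) can help audit which constraints are actually in effect; the pin filter in the last line is just an example pattern:

```tcl
# List every constraint file in processing order, including the XDCs delivered with IP
report_compile_order -constraints
# Report all timing exceptions (set_max_delay, set_false_path, etc.) and their sources
report_exceptions
# Look for timing paths that start at a BUFR output, which should not normally happen
report_timing -from [get_pins -hier -filter {NAME =~ "*BUFR*/O"}] -max_paths 5
```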
08-08-2016 11:29 AM
@avrumw Thanks so much for the response! Apologies for the delay, I've been out of town. Back in the office now.
I am an experienced embedded systems programmer, but am somewhat new to advanced FPGA design so I have a few clarifying questions:
"I do have to point out that it is "odd" to have a BUFR feed a BUFG; this is not a recommended clock connection, and will have large and unpredictable clock insertion - but that isn't what is causing this failure..."
"Look through all the .xdc files associated with the project - both your user .xdc files as well as all the .xdc files associated with the IP (the RGMII/GMII converter as well as the ILA)."
I've attached the .xdc files that I think are relevant, as well as the OOC versions (just in case). Any insight you could provide would be greatly appreciated.