Observer c_oflynn
Observer
6,977 Views
Registered: ‎02-24-2009

Non-Dedicated Clock Input Pin to DCM: Why is this Bad?


I'm trying to build a small add-on board for the Avnet S6 LX9 MicroBoard. This board has two extension headers, but neither of them has a GCLK input pin.

 

My board will take a clock input signal (roughly 16-64 MHz), pass it through a DCM, and use that signal to clock an external ADC and an internal BRAM-based FIFO. The DCM will be used both for frequency synthesis (4x clock multiplication) and for dynamic phase adjustment.
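For reference, a minimal Verilog sketch of this kind of clocking path on Spartan-6: input pin through IBUF into a DCM_SP configured for 4x synthesis and variable phase shift, with both the feedback and output clocks on global buffers. All names here (clk_pin, adc_clk, etc.) are placeholders, and CLKIN_PERIOD assumes a 32 MHz input; this is a sketch, not the poster's actual design.

```verilog
// Sketch only: Spartan-6 clocking path from a general-purpose input pin
// through a DCM_SP to global buffers. Signal names are placeholders.
module adc_clocking (
    input  wire clk_pin,      // external clock on a non-GCLK pin
    input  wire rst,
    input  wire ps_en,        // pulse to step the phase
    input  wire ps_incdec,    // 1 = increment phase, 0 = decrement
    output wire adc_clk,      // 4x synthesized clock for the ADC/FIFO
    output wire locked,
    output wire ps_done
);
    wire clk_ibuf, clk0, clk0_buf, clkfx;

    IBUF ibuf_i (.I(clk_pin), .O(clk_ibuf));

    DCM_SP #(
        .CLKIN_PERIOD       (31.25),      // assumes a 32 MHz input
        .CLKFX_MULTIPLY     (4),          // 4x frequency synthesis
        .CLKFX_DIVIDE       (1),
        .CLKOUT_PHASE_SHIFT ("VARIABLE"), // enable dynamic phase adjust
        .CLK_FEEDBACK       ("1X")
    ) dcm_i (
        .CLKIN    (clk_ibuf),
        .CLKFB    (clk0_buf),   // feedback via a BUFG compensates delay
        .RST      (rst),
        .PSCLK    (clk0_buf),
        .PSEN     (ps_en),
        .PSINCDEC (ps_incdec),
        .PSDONE   (ps_done),
        .CLK0     (clk0),
        .CLKFX    (clkfx),
        .LOCKED   (locked)
    );

    BUFG bufg_fb_i (.I(clk0),  .O(clk0_buf));
    BUFG bufg_fx_i (.I(clkfx), .O(adc_clk));
endmodule
```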

 

I don't care about the absolute phase relationship between source clock & output of DCM. I'll be adjusting the phase with the DCM anyway to line up ADC samples with events occurring on the external clock. Once the samples are lined up I want that phase relationship to remain constant.

 

So: what is the actual downside of using a generic input pin instead of a GCLK pin? I haven't been able to find much detail on why doing this is bad, just that it is. I'd like to know whether I should start looking for alternative hardware, or whether I'll be able to get away with what I currently have planned. Does it mean increased jitter, less predictable delay with voltage/temperature change, etc.? Some downsides I can live with in my application, but I don't even know what they are exactly!

 

Thanks,

 

    -Colin

1 Solution

Accepted Solutions
Scholar austin
Scholar
9,211 Views
Registered: ‎02-27-2008

Re: Non-Dedicated Clock Input Pin to DCM: Why is this Bad?


Colin,


If you do not care about absolute phase, then there is no problem.


The IBUF, dedicated clock pin route has a controlled, known delay (for the tools), and a general I/O pin, using the regular fabric interconnect, will have a delay that varies more, but is really no different in performance than the dedicated route.


It is true that if you used regular interconnect for all of your clock routing, you could have more jitter, but the short path from the IOB to the nearest global buffer access point (or DCM) is not enough of a span to introduce jitter.  By choosing the same resources for the DCM feedback path, you may actually compensate for the variations.  This may require hand placement of the routes in FPGA Editor, but could provide a better solution (at some cost of time, effort, and future maintainability).

 

You will get a warning, or more than one warning, which you may ignore.
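In the ISE flow, the usual way to acknowledge this warning (and turn the routing error into something the tools will accept) is the CLOCK_DEDICATED_ROUTE constraint. A UCF fragment, assuming a constraint file and a net named clk_pin (a placeholder name):

```text
# Tell the tools the non-dedicated clock route is intentional
# (clk_pin is a placeholder for the actual input clock net name)
NET "clk_pin" CLOCK_DEDICATED_ROUTE = FALSE;
```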

 

The delay will change with process (chip to chip), voltage, and temperature; but then, it would do so for the IBUF as well.  The use of the BUFG for the DCM feedback is designed to compensate for this.  If you are not pushing the performance too far, you will be able to choose a fixed delay sampling point.  Otherwise you will have to have a sort of training sequence to find the best spot to sample for each board, and perhaps re-calibrate if the voltage/temperature changes (as is done for DDR memory devices at very high speeds).
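Such a training sequence can be sketched as a small state machine driving the DCM's variable phase-shift port: pulse PSEN for one PSCLK cycle, wait for PSDONE, check sample quality, repeat. Here eye_good is a placeholder for whatever per-board sample-quality check the design uses; this is a sketch of the idea, not a drop-in implementation.

```verilog
// Sketch: step the DCM phase one increment at a time via PSEN/PSINCDEC,
// waiting for PSDONE after each step. The sample-quality check (eye_good)
// is application-specific and only a placeholder here.
module phase_trainer (
    input  wire clk,        // same clock that drives the DCM's PSCLK
    input  wire rst,
    input  wire start,      // begin a calibration sweep
    input  wire ps_done,    // from the DCM
    input  wire eye_good,   // placeholder: sampling point is acceptable
    output reg  ps_en,      // to the DCM (one-cycle pulse)
    output reg  ps_incdec,  // to the DCM (1 = advance phase)
    output reg  done
);
    localparam IDLE = 2'd0, STEP = 2'd1, WAIT = 2'd2;
    reg [1:0] state;

    always @(posedge clk) begin
        if (rst) begin
            state <= IDLE; ps_en <= 1'b0; ps_incdec <= 1'b1; done <= 1'b0;
        end else begin
            ps_en <= 1'b0;                    // default: no PSEN pulse
            case (state)
                IDLE: if (start) begin done <= 1'b0; state <= STEP; end
                STEP: begin
                    ps_en <= 1'b1;            // one-cycle PSEN pulse
                    state <= WAIT;
                end
                WAIT: if (ps_done) begin      // phase step complete
                    if (eye_good) begin done <= 1'b1; state <= IDLE; end
                    else state <= STEP;       // keep stepping the phase
                end
            endcase
        end
    end
endmodule
```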

 

 

Austin Lesea
Principal Engineer
Xilinx San Jose
5 Replies
Observer c_oflynn
Observer
6,962 Views
Registered: ‎02-24-2009

Re: Non-Dedicated Clock Input Pin to DCM: Why is this Bad?

Hi Austin,

Thanks for the information; that is basically exactly what I was hoping to hear! The variance with device/temperature/voltage isn't an issue, as the recalibration would be occurring for different inputs to the module anyway. It was only jitter I was really worried about, and it sounds like in my case it either won't be an issue or will be something I can manage with minimal effort.

Warm regards,

-Colin O'Flynn
Adventurer
Adventurer
6,940 Views
Registered: ‎01-21-2011

Re: Non-Dedicated Clock Input Pin to DCM: Why is this Bad?


I have done this many times: using a non-GCLK pin for the input clock.

 

IMO, the internal routing from a general IOB to the dedicated global buffer (BUFG) is still better than the clock's traces/segments on the PCB.

 

-----------

 

For some reason my boards end up with a zig-zag trace on the clock net that feeds into the FPGA... When I talked to the board developers, the answer I got was "data & clock length matching"???  But I always have to calibrate the clock phase to meet timing at the IOB.

Observer c_oflynn
Observer
6,910 Views
Registered: ‎02-24-2009

Re: Non-Dedicated Clock Input Pin to DCM: Why is this Bad?


A little nerd porn... the completed ADC board with a variable-gain amplifier (~10-40 dB gain); it should make that cheap Avnet board handy for interfacing with the real world! ADC Board (newae.com)

0 Kudos
Newbie gaurav7931
Newbie
4,701 Views
Registered: ‎12-09-2014

Re: Non-Dedicated Clock Input Pin to DCM: Why is this Bad?


May I know how much delay is introduced from a non-clock pin to the global buffer?
