07-19-2018 08:39 AM
Hi, is there a way to measure the clock frequency of the FPGA? What I mean is: I am currently using my FPGA for a project, and I read somewhere that the FPGA clock has an inaccuracy of +/-50 ppm. I have increased the clock frequency to 150 MHz, and I guess that means the absolute clock error is tripled as well, so the clock frequency is not exactly 150 MHz. But how do I measure it?
07-19-2018 12:23 PM
The inaccuracy is a function of the oscillator on your board; the FPGA itself does not generate a clock. If you triple the frequency, the absolute error will be three times as large, but since the frequency is also three times as large, the error expressed in ppm stays the same. Depending on how you triple the frequency (DCM vs. PLL), the jitter may increase, but that's another topic.
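To put numbers on this (a quick Python sanity check, not FPGA code; the 50 MHz source oscillator is just an assumed example): a ppm tolerance is relative, so the absolute error in Hz scales with the frequency while the ppm figure stays put.

```python
def abs_error_hz(freq_hz: float, tolerance_ppm: float) -> float:
    """Absolute frequency error implied by a ppm tolerance."""
    return freq_hz * tolerance_ppm * 1e-6

# Hypothetical 50 MHz oscillator multiplied up to 150 MHz, both +/-50 ppm:
print(abs_error_hz(50e6, 50))    # 2500.0 Hz at the oscillator
print(abs_error_hz(150e6, 50))   # 7500.0 Hz after tripling -- 3x the absolute
                                 # error, but still the same 50 ppm
```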
All you need to measure the frequency is a good frequency counter. They can be pricey, so if you are a college student, take your board to an electronics lab. Forward the clock from your FPGA to the counter (AC-couple it) and see what it says. The time base in good test equipment will be better than the oscillator on your board, so you should get a pretty good idea.
No frequency counter? You need a very accurate, stable clock signal. You might be able to get one out of some GPS equipment. Multiply it up to your oscillator frequency and find a way to compare the two. The old-fashioned way is to turn them into sine waves and mix them, then look at the beat frequency.
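Numerically, the beat-frequency idea looks like this (a plain-Python illustration assuming an ideal reference, not a measurement script):

```python
f_ref = 150e6                # assumed-perfect reference, multiplied up
f_dut = 150e6 * (1 + 50e-6)  # board oscillator running +50 ppm fast

beat = abs(f_dut - f_ref)    # mixing the two produces this difference
print(beat)                  # ~7.5 kHz -- a 50 ppm error becomes an
                             # audio-rate beat that is easy to count
```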
A fairly easy digital approach is to run two long counters, say 48 or 64 bits each, one off each clock; stop both when one rolls over and compare the counts.
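A software model of that two-counter scheme (hypothetical parameters; real hardware would do this in fabric, with the counter width setting the resolution):

```python
def ratio_from_counters(f_ref: float, f_dut: float, bits: int = 32) -> float:
    """Model two counters clocked by f_ref and f_dut; stop both when the
    reference counter rolls over and return the estimated ratio."""
    rollover = 1 << bits                 # reference counter wraps here
    gate_time = rollover / f_ref         # seconds elapsed at the rollover
    dut_count = int(gate_time * f_dut)   # what the other counter reads
    return dut_count / rollover          # estimate of f_dut / f_ref

# 10 MHz reference vs. a 150 MHz clock running +50 ppm fast:
est = ratio_from_counters(10e6, 150e6 * (1 + 50e-6))
print(est * 10e6)  # recovered DUT frequency; more bits -> finer resolution
```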
07-19-2018 09:43 PM - edited 07-19-2018 11:16 PM
I use the following design rules when creating my boards:
The second rule means that I can always compare the frequency ratios of at least two clocks inside each FPGA during manufacturing tests.
I thought about using the internal configuration clock as a reference (hey, it's free!), but it doesn't have the accuracy I require.
EDIT: I also try to ensure there is at least one reasonably accurate oscillator on each board. TCXOs with initial tolerances better than +/- a few ppm are small, low-power, cheap, and readily available (thank you, mobile-phone-driven technology).
EDIT2: With regard to the oscillator tolerance required for Ethernet, the IEEE 802.3 specification does say +/-100 ppm in a number of places. It also says that frames can be at most (about) 1.5 kB long. The clock-rate adjustment circuit in each Ethernet Rx port needs to deal with +/- (2 * 100 ppm * 1500 bytes/frame) = +/- 0.3 bytes per frame, and many circuits I've encountered happily deal with that but may have issues dealing with adjustments greater than +/- 1 byte per frame.
These days (or even a decade ago), you probably want network interfaces that support (9 kB) jumbo frames. Consider that +/- (2 * 100 ppm * 9 kB/frame) = +/- 1.8 bytes per frame. This may upset many rate-adjustment circuits. One solution is to use clocks with a much tighter tolerance than the worst-case value the IEEE specifies. I usually use +/-20 ppm (or tighter if I have a TCXO) for Ethernet.
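The slip arithmetic above, as a quick Python sanity check (the worst case is both ends off by the full tolerance in opposite directions, hence the factor of 2):

```python
def worst_case_slip_bytes(tolerance_ppm: float, frame_bytes: int) -> float:
    """Bytes a receiver's rate-adjustment circuit must absorb per frame
    when both clocks sit at opposite extremes of the tolerance."""
    return 2 * tolerance_ppm * 1e-6 * frame_bytes

print(worst_case_slip_bytes(100, 1500))  # standard frames: ~0.3 bytes/frame
print(worst_case_slip_bytes(100, 9000))  # jumbo frames:    ~1.8 bytes/frame
print(worst_case_slip_bytes(20, 9000))   # jumbo + 20 ppm:  ~0.36 bytes/frame
```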
07-20-2018 08:09 AM - edited 07-20-2018 08:12 AM
Hi @bruce_karaffa, thank you for your reply. What I have tried is to route the clock out through one of the IO pins on the board and connect it to an oscilloscope. Is that different from, or maybe less accurate than, what you are suggesting?
07-20-2018 08:29 AM
You have to realize that the difference between 150 MHz and 150.0075 MHz (150 MHz + 50 ppm) is less than a picosecond per clock period. You need to integrate over a fairly long interval to differentiate the two. An oscilloscope won't show you such small differences unless it has a good built-in frequency counter. If your scope can display a frequency, how many digits of precision does it show?
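The arithmetic behind that, as a Python back-of-the-envelope (assuming the error is exactly +50 ppm):

```python
f_nom = 150e6                 # nominal clock
f_err = 150e6 * (1 + 50e-6)   # 150.0075 MHz, i.e. +50 ppm

dt = 1 / f_nom - 1 / f_err    # per-cycle period difference
print(dt * 1e12)              # ~0.33 ps -- invisible on a scope trace

# Integrated over one second, though, the drift accumulates:
print(f_nom * dt * 1e6)       # ~50 us per second -- what a counter resolves
```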