Visitor
Registered: 08-06-2020

Delaying an incoming signal

What is the best way to delay an incoming signal through a large number of stages, with a highly repeatable (ideally identical) delay per stage? The delayed signal should also be latchable, e.g. by flip-flops along the delay line, as in various TDC (time-to-digital converter) schemes.

4 Replies
Scholar
Registered: 05-21-2015

Do you want to delay by an integer number of clocks?  Then this technique might work for you.  A fractional number of clocks?  Then look up the IDELAYE2 primitive.  A delay finer than that?  Then you might need to do some form of sub-sample interpolation.
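The integer-number-of-clocks case is just a shift register. A minimal behavioral sketch in Python (my own illustration of the idea, not code from this thread; in hardware each stage would be a flip-flop clocked by the sample clock):

```python
from collections import deque

class ShiftRegisterDelay:
    """Behavioral model of an N-clock delay line (a chain of N flip-flops)."""

    def __init__(self, n_clocks):
        # All stages start at 0, modeling the registers' reset state.
        self.stages = deque([0] * n_clocks, maxlen=n_clocks)

    def clock(self, sample_in):
        """Model one clock edge: shift the new sample in, the oldest out."""
        sample_out = self.stages[0]      # value captured n_clocks edges ago
        self.stages.append(sample_in)    # bounded deque drops the oldest stage
        return sample_out

# Example: a 3-clock delay reproduces the input stream 3 edges later.
delay = ShiftRegisterDelay(3)
outputs = [delay.clock(x) for x in [10, 20, 30, 40, 50, 60]]
# outputs == [0, 0, 0, 10, 20, 30]
```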

Dan

Visitor
Registered: 08-06-2020

 

Hi Dan,

It is a fractional number of clocks or even a smaller delay.

I'll look up the IDELAYE2 primitive indeed, thanks.

How do you perform a sub-sample interpolation?

 

Ermanno

Visitor
Registered: 08-06-2020

 

To be more precise, the delay I'm looking for is some tens of picoseconds.

Scholar
Registered: 05-21-2015

@Ermanno,

Look up the IDELAYE2 primitive then.

As for interpolating fractional delays, that technique works on samples rather than on logic.  You can read some of the theory behind it here--together with some examples of how to do it in the same repo.
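To make the sub-sample idea concrete: in software, a fractional delay is typically built as a fractional-delay filter, for example a windowed-sinc FIR. A small Python sketch (my own illustration, not the examples from the repo mentioned above):

```python
import math

def fractional_delay_taps(delay, n_taps=21):
    """Hamming-windowed-sinc FIR taps delaying a band-limited signal by
    `delay` samples (0 < delay < 1) plus the filter's integer group delay."""
    center = (n_taps - 1) / 2 + delay   # where the sinc peak falls
    taps = []
    for n in range(n_taps):
        x = n - center
        sinc = 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)
        window = 0.54 - 0.46 * math.cos(2 * math.pi * n / (n_taps - 1))
        taps.append(sinc * window)
    return taps

def apply_fir(signal, taps):
    """Direct-form convolution: output k uses inputs k, k-1, ..."""
    out = []
    for k in range(len(signal)):
        acc = 0.0
        for i, t in enumerate(taps):
            if 0 <= k - i < len(signal):
                acc += t * signal[k - i]
        out.append(acc)
    return out

# Delay a slow sine by half a sample.
sig = [math.sin(0.2 * k) for k in range(100)]
out = apply_fir(sig, fractional_delay_taps(0.5, 21))
# Away from the edges, out[k] closely tracks sin(0.2 * (k - 10.5)):
# 10 samples of group delay plus the requested 0.5.
```

With 21 taps the total delay is 10.5 samples: 10 from the filter's integer group delay plus the requested fraction. In hardware, by contrast, delays this fine come from dedicated delay primitives or carry chains rather than arithmetic on samples.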

Dan
