03-25-2013 09:32 AM
I have had this problem with both the FIR core gen and the DDS core gen: I am getting a staircased output with roughly 6 microseconds per step, regardless of the system clock frequency or the output frequency setting.
The simpler case is the DDS. I have tested it in a real-time system using a logic analyser, with system clocks of both 20 MHz and 120 MHz and output frequency settings of 1 MHz, 20 MHz and 60 MHz, and the output rate is always fixed at approximately 6 us per sample.
How can I get better resolution on the phase of the output? I see no reason I shouldn't be able to get a new output sample every clock cycle, i.e. roughly 8 ns phase steps with a 120 MHz system clock.
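For reference, here is the arithmetic I am working from, as a small Python sketch of the standard phase-accumulator DDS model (the scheme the DDS Compiler implements); the 32-bit phase width and the function names are just illustrative assumptions, not values taken from my .xco:

```python
# Minimal sketch of a phase-accumulator DDS; the 32-bit phase width and the
# function names are illustrative assumptions, not values from the .xco.

PHASE_WIDTH = 32                                   # assumed accumulator width in bits

def phase_increment(f_out_hz, f_clk_hz, width=PHASE_WIDTH):
    # Phase word that makes the accumulator wrap around at f_out.
    return round(f_out_hz / f_clk_hz * 2**width)

def synthesised_f_out(inc, f_clk_hz, width=PHASE_WIDTH):
    # Frequency actually produced by a given phase word.
    return inc * f_clk_hz / 2**width

f_clk = 120e6                                      # system clock from the post
f_out = 1e6                                        # one of the requested outputs

inc = phase_increment(f_out, f_clk)
print(f"phase increment word  : {inc}")
print(f"synthesised f_out     : {synthesised_f_out(inc, f_clk) / 1e6:.6f} MHz")
print(f"time per output sample: {1 / f_clk * 1e9:.2f} ns")   # ~8.3 ns at 120 MHz
```

The last line is the point: with one new sample per clock, the step size should track the system clock, not sit fixed at 6 us.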
03-25-2013 09:36 AM
03-25-2013 09:41 AM
Yes, I am changing both the input clock (wired to the clock input of the core) and the clock-frequency parameter in the GUI.
This is not a simulation; it is an actual logic analyser capture on an output test connector of the system under development. And yes, the sampling rate is sufficient: I have sampled at 20 MHz and at a finer resolution of 500 MHz.
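To put a number on what the capture shows, a quick back-of-the-envelope check (the 6 us step and 120 MHz clock are the figures above; the variable names are my own):

```python
# Back-calculating the effective update rate from the observed staircase.

observed_step_s = 6e-6                         # hold time per sample on the analyser
implied_update_hz = 1 / observed_step_s        # ~167 kHz

expected_clk_hz = 120e6
print(f"implied update rate: {implied_update_hz / 1e3:.0f} kHz")
print(f"slow-down factor   : {expected_clk_hz / implied_update_hz:.0f}x")
```

So whatever is clocking the core is effectively running around 167 kHz, roughly 720 times slower than expected.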
03-27-2013 09:49 PM
I would double-check the input clock frequency to the DDS core running in HW. Bring it out to a pin and look at it on a scope.
It isn't a matter of expecting the .xco files to be different; I just want one that works!
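If the raw clock is too fast to probe cleanly through a test connector, a common trick is to bring out the MSB of a free-running counter instead, which toggles at f_clk / 2^N, and measure that. A quick Python sketch of converting the measured pin frequency back to the clock rate; the 24-bit counter width is just an assumed example:

```python
# Turning a measured test-pin frequency back into the core clock frequency,
# assuming the pin carries the MSB of a free-running N-bit counter.

COUNTER_WIDTH = 24                             # assumed divider width

def clk_from_pin(measured_pin_hz, width=COUNTER_WIDTH):
    # Reconstruct the core clock from the divided-down test pin.
    return measured_pin_hz * 2**width

# A ~7.15 Hz square wave on the pin would imply ~120 MHz at the core.
print(f"{clk_from_pin(7.1526) / 1e6:.1f} MHz")
```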
03-28-2013 12:09 AM
OK, I tried a higher input clock as suggested: I bumped it up to 180 MHz and got a reasonably smooth output.
Then I gradually worked back down: 160 MHz and 140 MHz still looked good, and back at the original 120 MHz it also looks good. So I really have no idea what fixed it or what was wrong.
Thanks for the input; resolved for now. If it recurs I will repost.