Super Contributor
nju044
Posts: 112
Registered: ‎11-11-2009

help on chipscope


Hi all,

 

I need to debug some design errors with ChipScope. However, after inserting the ChipScope Inserter cores into the netlist, the implementation of the project seems to change (by this I mean the timing, because the indicating LED status goes wrong).

 

I've searched the forum with the keywords "chipscope timing" and found the following solution:

Try to reduce the complexity of Chipscope settings or try to use area group constraints to resolve the timing errors. 
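For reference, an area-group constraint of the kind suggested above can be written in the UCF roughly as follows. This is a sketch only: the instance names (icon_i, ila_i) and the slice RANGE are placeholders that must be adapted to the actual core instance names and a free region of the target device.

```text
# Hypothetical UCF fragment: confine the ChipScope cores to one region
# so they disturb the placement of the operational logic as little as possible.
INST "icon_i" AREA_GROUP = "AG_CHIPSCOPE";
INST "ila_i"  AREA_GROUP = "AG_CHIPSCOPE";
AREA_GROUP "AG_CHIPSCOPE" RANGE = SLICE_X80Y0:SLICE_X95Y63;
```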

 

But I need to observe the real working process of the circuit, so this solution seems impractical. How can I preserve the original implementation result from the design without the ChipScope ILA? I currently use ISE DS 12.3.

 

PS: I found that setting the start-up clock to CCLK causes ChipScope configuration to fail, while setting it to JTAG CLK does not. So I use JTAG CLK as the start-up clock. Could this cause the timing error? (Forgive me if this question is too basic.)

Best Regards,

Ninos K.
Expert Contributor
rcingham
Posts: 2,114
Registered: ‎09-09-2010

Re: help on chipscope

It is generally true that adding ChipScope to a design that just passes Static Timing Analysis can cause it to fail, if this is what you mean. This is because the extra logic for the ILA etc. distorts the 'ideal' placement/routing that meets STA. A good thing to try in this case is to P&R the design without ChipScope, turn on SmartGuide, add ChipScope, and re-run Map/P&R. What might happen is that the 'real' logic still meets timing, but some signals in ChipScope are one clock out, which does make it hard to be sure what is going on...
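The flow described above might look like this from the command line (a sketch, assuming an ISE 12.x installation; all file names are placeholders, and the Core Inserter step in between produces the top_cs netlist):

```text
# 1) Implement the design WITHOUT ChipScope and keep the routed result
ngdbuild top.ngc top.ngd
map -o top_map.ncd top.ngd
par top_map.ncd top_routed.ncd top_map.pcf

# 2) Insert ChipScope with the Core Inserter, then re-run Map/P&R
#    guided by the previous routed design via -smartguide
map -smartguide top_routed.ncd -o top_cs_map.ncd top_cs.ngd
par -smartguide top_routed.ncd top_cs_map.ncd top_cs_routed.ncd top_cs_map.pcf
```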

Remember your Quantum Mechanics classes - observing something changes it!

------------------------------------------
"If it don't work in simulation, it won't work on the board."
Super Contributor
nju044
Posts: 112
Registered: ‎11-11-2009

Re: help on chipscope

Hi rcingham,

 

Thanks for your answer. I did as you suggested, but it didn't work. I thought this might be caused by ChipScope, so I removed ChipScope from the project, keeping only SmartGuide. However, the design with SmartGuide still goes wrong. After I disabled SmartGuide, the project worked with the right indication signal. But in this situation, I cannot see what is going on inside the FPGA.

 

I wonder if there is a more controllable method. After all, SmartGuide may sometimes be blind to what I really intend. I remember that partitions, or something like that, may help preserve the original P&R result, after which you can add other logic as you like. Is there any tutorial I can refer to?
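For reference, ISE's partition flow keeps a hierarchy block's implementation fixed across runs. It is driven by an xpartition.pxml file in the project directory; a minimal sketch is shown below. The hierarchy names and attribute values are hypothetical, and the exact file format should be checked against the ISE Hierarchical Design Methodology Guide (UG748) for the tool version in use.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Project Name="top" FileVersion="1.2">
  <!-- Top level is re-implemented on each run -->
  <Partition Name="/top" State="implement">
    <!-- 'core_i' imports (keeps) its previous placement and routing -->
    <Partition Name="/top/core_i" State="import"
               ImportLocation="../previous_run"/>
  </Partition>
</Project>
```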

 

Also, about the JTAG CLK: is it appropriate as the start-up clock? During the above procedure, I set CCLK as the start-up clock.

Best Regards,

Ninos K.
Expert Contributor
hgleamon1
Posts: 1,221
Registered: ‎11-14-2011

Re: help on chipscope

I am not so sure about your ChipScope issues - it sounds like you have a very busy design. Is your design fully constrained with regard to timing? Is it possible to implement and ChipScope the design piece by piece rather than all at once (i.e. do you really need to look at all of those signals at the same time)?

 

The JTAGCLK should be used as a startup clock if you are configuring the FPGA via the boundary scan port. If you are configuring the FPGA by any Master or Serial option (i.e. from a PROM or another intelligent device like a microcontroller), you should be using the CCLK.
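In ISE this setting corresponds to the bitgen -g StartupClk option (exposed as "FPGA Start-Up Clock" in the Generate Programming File properties in Project Navigator). A sketch, with placeholder file names:

```text
# Bitstream to be downloaded over JTAG (e.g. via iMPACT or ChipScope):
bitgen -g StartupClk:JtagClk top_routed.ncd top_jtag.bit

# Bitstream to be loaded from a PROM / master-serial source:
bitgen -g StartupClk:Cclk top_routed.ncd top_prom.bit
```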

 

Regards,

 

Howard

 

 

----------
"That which we must learn to do, we learn by doing." - Aristotle
Super Contributor
nju044
Posts: 112
Registered: ‎11-11-2009

Re: help on chipscope


Hi Howard,

 

Thank you for the reply.

 

"The JTAGCLK should be used as a startup clock if you are configuring the FPGA via the boundary scan port. If you are configuring the FPGA by any Master or Serial option (i.e. from a PROM or another intelligent device like a microcontroller), you should be using the CCLK. "

---- I found a lab describing how to use ChipScope to debug a design (aimed at beginners). Although configuring the FPGA via ChipScope requires JTAG, the project in the lab uses CCLK as the start-up clock and works well. So I'm a little confused by what you said.

 

Also, I've seen a document saying, "In order to use chipscope analyzer, you have to set Device Configuration Clock = JTAG CLK". Does the "Configuration Clock" here refer to the "Start-up Clock" in ISE?

Maybe my "configuration" and your "configuration" have different meanings? I think "configuring via the boundary scan port" means programming the .bit file into the FPGA, while "configuring the FPGA by any Master or Serial option" means programming the .mcs file into external flash and then loading the flash data into the FPGA. And how does the start-up clock affect these procedures?

Best Regards,

Ninos K.
Expert Contributor
hgleamon1
Posts: 1,221
Registered: ‎11-14-2011

Re: help on chipscope

The "startup clock" is the clock the FPGA uses to clock the bitstream in to configure the device. It is only relevant for configuration (i.e. "programming" the FPGA). Thus, if you load the bitstream through JTAG, you should set the startup clock to be JTAGCLK. If the bitstream is loaded by another method, you should use CCLK.

 

Although configuring FPGA via chipscope

 

You can do this? This is a "feature" of Chipscope that I never knew about.

 

Also, I've seen a document saying "In order to use chipscope analyzer, you have to set Device Configuration Clock = JTAG CLK

 

I don't really understand this. I thought that a ChipScope core was added at the netlist level, meaning that you must run Translate again in order to be able to produce a new .bit file, which may then be used to configure the FPGA. The core is accessed (via JTAG) AFTER the device is up and running, meaning that the startup clock isn't relevant any more. Which document are you referring to?

 

Anyway, can you reduce the size of your ChipScope core and get the performance results you want? Have you confirmed that your design is fully constrained? (I thought that by constraining the design, it should be unaffected by ChipScope in terms of timing.)
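On the "fully constrained" point, at minimum every clock domain should carry a PERIOD constraint in the UCF so that STA actually checks all the relevant paths. A sketch (net name and period are placeholders):

```text
# Hypothetical UCF timing constraint: a 100 MHz system clock
NET "clk" TNM_NET = "tnm_clk";
TIMESPEC "TS_clk" = PERIOD "tnm_clk" 10 ns HIGH 50 %;
```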

 

Regards,

 

Howard

 

----------
"That which we must learn to do, we learn by doing." - Aristotle
Expert Contributor
rcingham
Posts: 2,114
Registered: ‎09-09-2010

Re: help on chipscope

"I thought that by constraining the design then it should be unaffected by Chipscope in terms of timing"

I don't think so (but I've been wrong before...).
ChipScope adds logic, and needs routing resources to get the signals-to-be-observed to its logic. This distorts the operational logic, and so can (and does, in my experience) break timing closure in a busy design.

------------------------------------------
"If it don't work in simulation, it won't work on the board."
Expert Contributor
hgleamon1
Posts: 1,221
Registered: ‎11-14-2011

Re: help on chipscope

ChipScope adds logic, and needs routing resources to get the signals-to-be-observed to its logic

 

I did think about this after I posted my last message. I had made a (rather large) assumption that the tools would (somehow) prioritise the constrained logic and then shoehorn the ChipScope logic around it. Practically, though, I can't imagine how they would be able to do this.

 

As a slight digression (sorry) - just how "busy" does a design need to become before this "observer effect" is found?

 

Regards,

 

Howard

 

----------
"That which we must learn to do, we learn by doing." - Aristotle
Super Contributor
nju044
Posts: 112
Registered: ‎11-11-2009

Re: help on chipscope


You can do this? This is a "feature" of Chipscope that I never knew about.

 

By "configuring the FPGA via ChipScope" (I will use this phrase from here on), I mean loading the bitstream (via ChipScope) into the FPGA to configure the device.

 

I don't really understand this. I thought that a Chipscope core was added at a netlist level, meaning that you must Translate again in order to be able produce a new .bit file, which may then be used to configure the FPGA. The core is accessed (via JTAG) AFTER the device is up and running, meaning that the startup clock isn't relevant any more. Which document are you referring to?

 

The configuration clock means the start-up clock, I think.

 

To sum up (just to check that I grasp what you're telling me): configuration can be done with either JTAGCLK or CCLK. Using ChipScope to configure the device requires JTAG, so the start-up clock should be JTAGCLK (according to my experience, using CCLK as the start-up clock results in configuration failure). After configuration, JTAGCLK is used to access the ILA core.

 

Have you confirmed that your design is fully constrained.


I will make the design fully constrained, then provide further information. Thank you very much!

Best Regards,

Ninos K.
Expert Contributor
hgleamon1
Posts: 1,221
Registered: ‎11-14-2011

Re: help on chipscope

By "configuring the FPGA via ChipScope" (I will use this phrase from here on), I mean loading the bitstream (via ChipScope) into the FPGA to configure the device.

 

I still don't understand how this is possible. But never mind, it is off the main thread topic.

 

Regarding the startup clock, I believe we are talking about the same thing, just possibly from slightly different viewpoints.

 

Regards,

 

Howard


----------
"That which we must learn to do, we learn by doing." - Aristotle