08-07-2017 09:20 AM
Hello,
I would like to implement a remote debugging solution for our Kintex UltraScale designs using XVC. I've read through all of the available answer records, user guides, product guides and forum posts related to XVC. It seems that the Debug Bridge IP v2.0 core in the From_AXI_to_BSCAN configuration is the best solution for our needs. However, there is no documentation regarding the AXI interface for this core.
1. Can the stream of data read from the network socket be sent directly to the AXI interface, or is user logic required to parse and implement the XVC commands (a la XAPP1251)? And if it's the latter case, what data should then be sent into the Debug Bridge core's AXI interface?
2. A few forum posts indicate that the hw_server in newer versions of Vivado uses undocumented XVC commands. Can you please explain how these new commands are used?
XVC seems like a powerful and flexible remote debugging solution. Please provide a little more information about the interface so that we can make use of it.
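For reference, my understanding of the XVC 1.0 protocol from XAPP1251 is that it consists of just three commands ("getinfo:", "settck:", "shift:"), each an ASCII keyword followed by fixed-size binary fields. Here is a rough Python sketch of what the server-side parsing would look like; the actual JTAG shift is stubbed out, and `do_jtag_shift` and `MAX_VECTOR_BITS` are placeholder names of my own, not anything from the Debug Bridge core:

```python
import socket
import struct

MAX_VECTOR_BITS = 4096  # advertised shift buffer size; an arbitrary choice here


def read_exact(conn, n):
    """Read exactly n bytes from the socket (XVC binary fields are fixed-size)."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed mid-field")
        buf += chunk
    return buf


def do_jtag_shift(num_bits, tms, tdi):
    """Placeholder for driving TMS/TDI onto the JTAG pins and sampling TDO.
    Here it just echoes TDI back so the sketch is self-contained."""
    return tdi


def handle_xvc_connection(conn):
    """Serve XVC 1.0 commands (getinfo:, settck:, shift:) on one connection."""
    while True:
        # The command keyword is ASCII up to and including the ':' delimiter.
        cmd = b""
        while not cmd.endswith(b":"):
            c = conn.recv(1)
            if not c:
                return  # client disconnected
            cmd += c
        if cmd == b"getinfo:":
            conn.sendall(b"xvcServer_v1.0:%d\n" % MAX_VECTOR_BITS)
        elif cmd == b"settck:":
            # 4-byte little-endian TCK period in ns; reply with the period in use.
            (period_ns,) = struct.unpack("<I", read_exact(conn, 4))
            conn.sendall(struct.pack("<I", period_ns))
        elif cmd == b"shift:":
            # 4-byte little-endian bit count, then TMS and TDI vectors.
            (num_bits,) = struct.unpack("<I", read_exact(conn, 4))
            nbytes = (num_bits + 7) // 8
            tms = read_exact(conn, nbytes)
            tdi = read_exact(conn, nbytes)
            conn.sendall(do_jtag_shift(num_bits, tms, tdi))
        else:
            return  # unknown command: drop the connection
```

If the Debug Bridge's AXI interface expects pre-parsed JTAG vectors rather than the raw socket stream, something like the above would have to sit in between; that's exactly the part I'd like clarified.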
Thanks,
Dave
08-07-2017 11:28 PM
1. I think the answer is that an xvcd (XVC daemon) process is needed to interpret the commands and read/write the hardware. There's a Zynq reference design somewhere that uses that bridge core, but it runs xvcd on the target hardware; I don't know how the Kintex communications would work.
2. I use a version of xvcd running on my development host, derived from a modified xvcd here which talks to an FT4232 chip and uses it as a custom JTAG interface. Once that's running, hw_server is launched with:
hw_server -e "set auto-open-servers xilinx-xvc:localhost:2542"
And then I have to run xsdb and issue:
xsdb% connect
After that the SDK, debuggers and Vivado Hardware Manager can find the JTAG interface.
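Incidentally, a quick way to sanity-check that an xvcd instance is reachable before pointing hw_server at it is to perform the "getinfo:" handshake by hand. A minimal sketch (`probe_xvc` is just an illustrative helper of mine; 2542 is the conventional XVC port):

```python
import socket


def probe_xvc(host="localhost", port=2542, timeout=2.0):
    """Send the XVC 'getinfo:' handshake and return the server's version line.
    Raises OSError if nothing is listening on host:port."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(b"getinfo:")
        reply = b""
        while not reply.endswith(b"\n"):
            chunk = s.recv(64)
            if not chunk:
                break  # server closed without a full reply
            reply += chunk
        return reply.decode("ascii", errors="replace").strip()
```

A healthy xvcd should answer with something like "xvcServer_v1.0:" followed by its maximum vector length; if this probe fails, hw_server's auto-open-servers setting will fail too.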
08-11-2017 06:19 AM - edited 08-15-2017 05:55 PM
Thanks for your response. Have you had success using this remote debugging scheme with 2017.x versions of Vivado?
08-15-2017 05:59 PM
I'm not sure if I've tried 2017.x yet. If I do I'll update here.
08-15-2017 07:35 PM
Quick test with 2017.2 and the ZCU102 ES2 works:
tactical@tactical-VirtualBox:~$ xsct
****** Xilinx Software Commandline Tool (XSCT) v2017.2
  **** Build date : Jun 15 2017-18:45:22
    ** Copyright 1986-2017 Xilinx, Inc. All Rights Reserved.
xsct% connect
tcfchan#0
xsct% targets
  1  PS TAP
  2  PMU
  3  PL
  4  PSU
  5  RPU
  6  Cortex-R5 #0 (Halted)
  7  Cortex-R5 #1 (Lock Step Mode)
  8  APU
  9  Cortex-A53 #0 (Running)
 10  Cortex-A53 #1 (Running)
 11  Cortex-A53 #2 (Running)
 12  Cortex-A53 #3 (Running)
I'm then getting random errors which I suspect are just due to running 30MHz JTAG over a spaghetti pile of cables from the FT4232 mini module.
10-29-2018 04:59 AM - edited 10-29-2018 05:00 AM
@lcameron, @david.hoffman, thanks for sharing this info. Did you get any further on this in the meantime? Did you fix the USB JTAG issues?
For Zynq devices, it would be great if XVC supported not only Vivado hardware debug (ILA, VIO) but also full remote SDK debug, by putting a networked controller next to the Zynq device (much like the new SmartLynq cable from Xilinx).
I opened two related forum posts:
* UG908 contains contradictory information on Xilinx Virtual Cable (XVC)
* will Xilinx Virtual Cable (XVC) support remote debug of the ARM cores in the future?
I got some feedback from Xilinx on this, but I'm waiting for more :-)