
Designing an XVC project for Remote Debugging of Zynq UltraScale+ devices

Moderator

Introduction

Xilinx offers a wide range of IPs for hardware debugging of FPGA and SoC resources, such as the ILA, VIO, and IBERT cores. Traditionally, such cores are accessed via a JTAG connection, which requires the device to be locally and physically accessible.

This constraint is impractical in many applications where the FPGA is in a hard-to-access location, where there is no direct access to the FPGA JTAG pins, or where the system is deployed in the field.

To resolve such issues, Xilinx introduced the Xilinx Virtual Cable (XVC), which is a TCP/IP-based protocol that acts like a JTAG cable and provides a means to access and debug the FPGA or SoC design without using a physical cable.
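Under the hood, XVC 1.0 is a simple TCP request/reply protocol with three commands: getinfo:, settck:, and shift:. As an illustration only (this encoder is not part of Xilinx's tools; the framing follows the publicly documented XVC 1.0 protocol), a shift: request carries a 4-byte bit count followed by LSB-first packed TMS and TDI vectors:

```python
import struct

def shift_request(tms_bits, tdi_bits):
    """Frame an XVC 1.0 'shift:' request.

    tms_bits/tdi_bits are equal-length sequences of 0/1, least
    significant bit first; the server replies with a TDO vector of
    (num_bits + 7) // 8 bytes.
    """
    assert len(tms_bits) == len(tdi_bits)
    n = len(tms_bits)

    def pack(bits):
        # Pack bits LSB-first into (n + 7) // 8 bytes.
        out = bytearray((n + 7) // 8)
        for i, b in enumerate(bits):
            if b:
                out[i // 8] |= 1 << (i % 8)
        return bytes(out)

    # "shift:" + 4-byte little-endian bit count + TMS vector + TDI vector
    return b"shift:" + struct.pack("<I", n) + pack(tms_bits) + pack(tdi_bits)
```

For example, shift_request([1, 1, 0], [0, 1, 1]) yields the b"shift:" header, a bit count of 3, and one packed byte each for TMS and TDI.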

The Vivado Design

This tutorial will target the remote debugging of a Zynq® UltraScale+™ MPSoC ZCU102 Evaluation Kit board and will use Vivado and PetaLinux 2018.3 as development tools. All of the Vivado development will be done using the Block Design Interface.

In our example design, we wish to debug the outputs of the Binary Counter IP, which is configured as a simple 8-bit counter. Its outputs are also connected to LEDs on the board, which will light up according to the counting progress. A System ILA is an adequate debug core for this purpose.

The Zynq UltraScale+ MPSoC block has also been added to the canvas in order to enable the use of PS resources in the design. We will also use the PS-generated clock for all of the blocks in the project.

Adding and connecting these IPs together results in a design similar to the image below:

BD_1.png

The first step to enable XVC is to integrate the remote debugging architecture into the PL design. The Debug Bridge is the main core for this functionality. It provides a mechanism to establish the communication between the debug cores and non-JTAG interfaces (for example, Ethernet/PCIe).

For this application, the XVC communication will be received via TCP/IP by the PS side and transmitted to the PL side via the AXI protocol.

As a result, the Debug Bridge is going to be configured in AXI to BSCAN mode.

Figure_2.png

In the BSCAN Options tab, the JTAG Fallback Mode is also enabled.

This option allows communication with the debug cores to be established via a local JTAG connection in case the XVC connection becomes unstable or unresponsive.

Figure_3.png

Because the JTAG Fallback option has been enabled, an output port named m0_bscan has been added to the Debug Bridge.

A second Debug Bridge core, configured in BSCAN to DebugHub mode, is required to finalize and enable the feature.

After inserting and configuring the Debug Bridge cores, use the “Run Connection Automation” option in the Block Design.

Figure_4.png

The Connection Automation Wizard will insert the IP blocks and perform the connections required to enable the communication between the Debug Bridge IP and the PS.

Figure_5.png

The final Block Diagram can be seen below:

Figure_6.png

At this point, the design is finalized. Before proceeding, click on the Validate Design button to have Vivado check if any connections were missed or any misconfigurations were made.

If all of the configurations were correctly made, a 'Validation successful' message is issued.

Figure_7.png

In the Sources window, right-click on the design_1.bd file and select the “Create HDL Wrapper” option.

Figure_8.png

In the Flow Navigator, click on “Generate Bitstream”. When prompted, allow Vivado to Run Synthesis and Implementation.

Once the Bitstream has been successfully generated, we need to create the Hardware Description File (.hdf), which describes the hardware resources that were just built and will be used by PetaLinux to create a PS project.

To do so, click on File -> Export -> Export Hardware. Make sure to check the “Include Bitstream” box when prompted.

Figure_9.png

At this point, a file named design_1_wrapper.hdf should be present in the <project_name>.sdk directory within your Vivado Project. Copy this file to a different location, where you will create a PetaLinux project.

PetaLinux Project

Now that we have a hardware design, a PetaLinux project needs to be created. This will result in an embedded Linux image that will be written to an SD card and used to boot the PS side of the ZCU102 board.

A PetaLinux 2018.3 environment in a Linux OS is needed for this process. For more information about installing PetaLinux, consult the document PetaLinux Tools Documentation: Reference Guide (UG1144).

In a clean directory of your choice, create a PetaLinux project. In this example, the directory is named 'my_proj'.

[host] /my_proj $ petalinux-create -t project -n XVC -s Petalinux-v2018.3/bsp/release/Xilinx-zcu102-v2018.3-final.bsp

Note: The Board Support Package (.bsp file) is a collection of libraries and drivers that will form the lowest layer of your application software stack and is specific to the ZCU102 board.

The BSP files for Xilinx boards are normally included with the PetaLinux installation. The one used in this example design is also attached to this post.

We now have a PetaLinux project named XVC and a directory of the same name has been created.

Enter the XVC directory and run the command below:

[host] /my_proj/XVC $ petalinux-config --get-hw-description=../<location for the file design_1_wrapper.hdf>

This command will link the PetaLinux project with the resources we enabled in the design. After a few seconds, the message and window below will be shown.

Figure_10.png

Figure_11.png

Navigate to DTG Settings -> Kernel Bootargs. Uncheck the “generate boot args automatically” option.

Move down and select the “user set kernel bootargs” option. This will allow us to enter custom boot arguments (bootargs) for the device.

The window below should be shown:

Figure_12.png

Insert the following arguments and click OK when done:

earlycon console=ttyPS0,115200 clk_ignore_unused uio_pdrv_genirq.of_id=generic-uio 

Save and exit the Menu. The messages below should be displayed.

Figure_13.png

Once the system configuration process is complete, run the command below to start the kernel configuration for the embedded OS:

[host] /my_proj/XVC $ petalinux-config -c kernel

The Kernel creation and initialization will start, and the messages below will be issued:

Figure_14.png

When the initialization is complete, a configuration menu will be shown:

Figure_15.png

Navigate to Device Drivers -> Userspace I/O drivers.

Check the two options as in the image below, which will allow the UIO drivers to be included in the kernel.

Figure_16.png

Return to the initial menu and navigate to CPU Power Management -> CPU Idle.

Make sure that “CPU idle PM support” is unchecked.

Figure_17.png

Save and exit the menu. The kernel configuration process will continue and will issue the following message when completed.

“[INFO] successfully configured kernel”

Once this process is completed, we need to modify the device tree so that the Debug Bridge is exposed to Linux user space through the UIO framework.

Access the file project-spec/meta-user/recipes-bsp/device-tree/files/system-user.dtsi and modify it with the following configs:

Figure_18.png
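The edit shown in the figure amounts to overriding the Debug Bridge node's compatible string so that the generic UIO driver (enabled by the uio_pdrv_genirq.of_id=generic-uio bootarg set earlier) binds to it. A sketch of the system-user.dtsi change, assuming the node label debug_bridge_0 that the device tree generator produces for this design (check the generated pl.dtsi for the actual label in your project):

```
/* Assumption: the Debug Bridge node is labeled debug_bridge_0 in the
 * generated pl.dtsi; adjust the label to match your design. */
&debug_bridge_0 {
    compatible = "generic-uio";
};
```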

We can now create a PetaLinux application (in this case, the XVC server app) to be included in the boot image.

Issue the following command:

[host] /my_proj/XVC $ petalinux-create -t apps -n xvcserver --template c --enable

This command will create the template for the xvcserver application, including a “hello world” template in C.

Navigate to the directory project-spec/meta-user/recipes-apps/xvcserver/files and verify that it contains a file named xvcserver.c.

Copy the XVC Server code attached to this blog post and replace the xvcserver.c template with it. The XVC code provided in this blog is similar to the one used for the XVC applications in Zynq®-7000 devices (as described in XAPP1251) but has been modified to work with Zynq UltraScale+ devices.

Note: In the new xvcserver.c file, make sure that the correct UIO (most likely /dev/uio1) is opened as a file pointer in the code.

The original code for Zynq-7000 devices uses uio0, but for MPSoC devices, uio1 must be used.
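Rather than hardcoding the uio index, the correct node can be discovered at runtime by matching the name attribute exposed under /sys/class/uio (the same attribute we query with cat later in this post). The attached server is written in C, but the scan is easy to port; a sketch of the idea in Python (the helper name is just for illustration):

```python
import glob
import os

def find_uio(name, sysfs="/sys/class/uio"):
    """Return the /dev/uioN path whose sysfs 'name' attribute matches."""
    for path in sorted(glob.glob(os.path.join(sysfs, "uio*"))):
        with open(os.path.join(path, "name")) as f:
            if f.read().strip() == name:
                return "/dev/" + os.path.basename(path)
    raise FileNotFoundError("no UIO device named %r" % name)
```

On the booted ZCU102 from this tutorial, find_uio("debug_bridge") would be expected to return /dev/uio1.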

Finally, we are ready to build the complete PetaLinux Image by running the command below:

[host] /my_proj/XVC $ petalinux-build

The build process can take several minutes and will issue several messages as in the screen capture below.

Ignore the error about failing to start the TFTP process, as we are not configuring the device to support that feature.

Figure_19.png

This process creates several ELF files in the directory images/linux. The last step is to package these files into a boot image (BOOT.BIN) to be saved to an SD card and inserted into the ZCU102.

Issue the following command, making sure to point to the correct location for the files zynqmp_fsbl.elf, u-boot.elf, pmufw.elf and design_1_wrapper.bit.

[host] /my_proj/XVC $ petalinux-package --boot --fsbl images/linux/zynqmp_fsbl.elf --u-boot images/linux/u-boot.elf --pmufw images/linux/pmufw.elf --fpga ../../hw/design_1_wrapper.bit

The process will issue several messages and will successfully end with the following message:

“INFO: Binary is ready”.

Figure_20.png

Navigate to the directory /XVC/images/linux and verify that the files BOOT.BIN and image.ub are present.

Figure_21.png

Copy these two files to an SD card. Make sure that the SD card is formatted as FAT32.

ZCU102 Booting

Insert the SD card into the ZCU102. Make sure to also set the Boot Mode Pins (SW6) to SD card, which is On-Off-Off-Off.

In addition, connect an Ethernet cable to the board so that it can reach your local network.

Finally, plug a Digilent cable into the ZCU102 UART port in order to verify if the boot was successful and perform runtime configurations.

Power on the ZCU102 and, using TeraTerm (or your preferred terminal application), connect to the board's serial port (COM5 in this example). Once connected, make sure to set the connection baud rate to 115200.

In TeraTerm, you should now be able to see the messages related to the boot process (if necessary, reset the ZCU102 so that all of the messages are displayed).

Once the boot process is finished, the root credentials will be required to access the system. Use the following default credentials:

Username: root
Pass: root

Once you have access to the system, the first step is to verify that the Debug Bridge is accessible via UIO, as configured in the PetaLinux project. Issue the following command and verify that debug_bridge is returned:

root@xilinx-zcu102-2018_3:~# cat /sys/class/uio/uio1/name
debug_bridge

The connection to the XVC application will be done via TCP/IP protocol, so it is necessary to know the IP address for the ZCU102 board.

Issue the following command and take note of the IP address. In our project, the IP address obtained is 172.20.9.168.

root@xilinx-zcu102-2018_3:~# ifconfig

Figure_22(2).png

We are now ready to start the XVC application and connect to it remotely.

Start the application with the following command:

root@xilinx-zcu102-2018_3:~# xvcserver
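Before pointing Vivado at the board, you can sanity-check from your host that the server is listening on the default XVC port 2542 by sending the getinfo: command. A minimal sketch (the banner format, xvcServer_v1.0:<max_vector_len>, is an assumption based on the standard XVC 1.0 handshake; verify against your server's actual reply):

```python
import socket

def xvc_getinfo(host, port=2542, timeout=5.0):
    """Send the XVC 'getinfo:' command and return the server's banner."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(b"getinfo:")
        return s.recv(64).decode("ascii").strip()

# Example, using the IP address from this post's setup:
# print(xvc_getinfo("172.20.9.168"))
```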

 

Connecting to the XVC Server

Open the Hardware Manager in Vivado 2018.3 on your local host.

Select the options Open Target -> Open New Target. Once the wizard opens, select 'Connect to Local Server'.

In the Select Hardware Target window, click on the button “Add Xilinx Virtual Cable (XVC)”.

Figure_23.png

The following window will appear. Add the IP address for your ZCU102, leave the Port Number as 2542, and click OK.

Figure_24(2).png

A virtual cable is now available for connection as a Hardware Target, and we can observe the Debug Bridge available as a Hardware Device in the JTAG chain. Click Next and finish the connection to this target.

Figure_25(2).png

In the TeraTerm terminal (if you still have it open), we can verify that a connection has been accepted.

Figure_26.png

In the Hardware Manager, we can now see a connection to the target 172.20.9.168, which contains a Debug Bridge device and an ILA available.

Because we have not provided the LTX file for the ILA, none of the nets are being displayed yet. In the lower-right window, click on the blue link “Specify the probes file and refresh the device”.

Figure_27.png

When prompted, provide the location of the probes file design_1_wrapper.ltx, which was generated by Vivado. It should be located in the directory <vivado_project>/<vivado_project>.runs/impl_1.

At this point, the ILA probes can be seen (in this example, we only have an 8-bit probe named LEDS), but no waveform is captured yet. Expand the LEDS probe and click the ILA “Run Trigger Immediate” button to load the waveforms.

Figure_28.png

JTAG Fallback

In the first section of this blog, when creating the PL design, we configured one of the Debug Bridge cores to provide a JTAG Fallback option, which can be used to locally debug the ZCU102 board in case the XVC application stops working.

To test and verify this feature in action, while still connected to the XVC from the previous step, connect a Digilent JTAG cable to the JTAG port in the ZCU102 board and to your host.

The Hardware tab will be automatically refreshed, and two targets should now be available (the original XVC target and a new Digilent target, which is initially closed).

Figure_29.png

To simulate the XVC target becoming unresponsive, close this target and open the new Digilent target. You might see a warning stating that “closing the target will cause one or more waveform windows to close”. Click OK to accept it.

Once connected to the Digilent target, we should again see the black canvas and the option to specify the ILA probes.

Now that we are locally connected, the SysMon and arm_dap devices are also available.

Figure_30.png

Provide the location of the probes file design_1_wrapper.ltx, refresh the target, and click the “Run Trigger Immediate” button.

Observe that we can again visualize the ILA waveforms, but now from a local connection.

Figure_31.png

References