
Adam Taylor’s MicroZed Chronicles, Part 125: Software for the Avnet EVK (Embedded Vision Kit)

by Xilinx Employee on 04-04-2016 11:15 AM

 

By Adam Taylor

 

 

Last week, we built the hardware for our vision system from the Avnet EVK (Embedded Vision Kit). This week, we need to create a software application that will configure the peripherals as we require them. The software we want to develop must configure the following:

 

  • The Python 1300C camera using its SPI interface
  • The AXI Python 1300C interface module
  • The AXI VDMA, to read frames from and write frames to DDR memory
  • The AXI Color Filter Array
  • The ADV7511 HDMI output device
  • The I2C Mux and its attached peripherals of interest
  • A further I2C mux located on the On Semi camera module, which is used for power control

 

One of the main aspects of the software is EVK configuration using I2C. The MicroZed’s I2C interface in the Zynq SoC’s PS (processing system) is connected to an I2C multiplexer on the mated EVCC (Embedded Vision Carrier Card). This mux allows us to drive as many as six connected peripherals on the EVCC. In this application, we need to configure the HDMI TX and the I2C expander. A schematic and the memory addresses of the EVCC and MicroZed are shown below.

 

 


 

EVCC I2C structure

 

 


 

 

EVCC I2C Memory Map
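As an aside, selecting a downstream device through the TCA9548 mux comes down to writing a single control byte over I2C, with one bit per downstream channel; the Avnet TCA9548 API wraps this up for us. The sketch below shows the idea using the Zynq PS XIicPs driver directly. The 7-bit mux address (0x70), the channel number, and the function name are assumptions for illustration only; the real addresses are in the EVCC memory map above.

#include "xiicps.h"

/* Illustrative 7-bit address for the TCA9548 mux; the real address is in the EVCC memory map. */
#define TCA9548_I2C_ADDR  0x70

/* Connect a single downstream channel (0-7) through the TCA9548.
 * The mux has one control register; setting bit n routes I2C traffic to channel n. */
int tca9548_select_channel(XIicPs *iic, u8 channel)
{
    u8 control = (u8)(1u << channel);

    return XIicPs_MasterSendPolled(iic, &control, 1, TCA9548_I2C_ADDR);
}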

 

 

The camera module also contains an I2C I/O expander. This expander enables the CMOS sensor’s power rails and is connected to the Camera Interface of the I2C mux on the EVCC, as shown below.

 

 


 

Camera Module I2C Expander

 

 

We will need software to control the I2C mux and the downstream modules. The Avnet GitHub helpfully provides all the files we need to get this design up and running quickly and easily. All we have to do is pull these files together and write the main program, which uses the provided APIs. The files we need from the Avnet GitHub are both the header and the source code files for the following:

 

  • ADV7511 – API for the HDMI output
  • CAT9554 – API for the I2C I/O expander on the camera module
  • TCA9548 – API for the I2C mux on the EVCC
  • PCA9534 – API for the I2C IO expander on the EVCC
  • OnSemi_Python_SW – API for the Python 1300C
  • XAXIVDMA_EXT – API for configuring the VDMA
  • XIICPS_EXT – API for driving the external I2C

 

Once we include these files in our project, all we have to do is initialize the peripherals as we normally would in any application. We then use the APIs to configure how we want our solution to function.
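As a rough sketch of that initialization step, assuming the PS I2C controller is I2C 0 running at 100 kHz (both assumptions), the standard XIicPs bring-up looks something like this; the initialized instance is then what gets passed to the Avnet APIs (piicps0 in the snippets that follow).

#include "xiicps.h"
#include "xparameters.h"
#include "xstatus.h"

static XIicPs iicps0;   /* PS I2C controller instance handed to the Avnet APIs */

int init_ps_iic(void)
{
    XIicPs_Config *cfg;

    /* Look up and initialize the PS I2C 0 controller (device ID is an assumption) */
    cfg = XIicPs_LookupConfig(XPAR_XIICPS_0_DEVICE_ID);
    if (cfg == NULL)
        return XST_FAILURE;

    if (XIicPs_CfgInitialize(&iicps0, cfg, cfg->BaseAddress) != XST_SUCCESS)
        return XST_FAILURE;

    /* 100 kHz bus speed is illustrative */
    return XIicPs_SetSClk(&iicps0, 100000);
}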

 

The next step is to correctly configure the I2C I/O expander. Its power-up default is for the expander’s I/O pins to be set as inputs, and we require them to be outputs. So the first thing we need to do is set these pins as outputs. We can do this very simply with the following command, which writes 0s to the I2C I/O expander’s configuration register, setting the corresponding pins as outputs:

 

status = 0x08;

pca9534_set_pins_direction(pdemo->piicps0, status);

 

(See this document for more info.)
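Under the hood, the PCA9534 holds pin directions in its configuration register (register 3), where a 0 bit makes the corresponding pin an output; that is all pca9534_set_pins_direction needs to write. A minimal register-level sketch, assuming an illustrative 7-bit expander address of 0x20 (the real address is in the memory map above):

#include "xiicps.h"

#define PCA9534_I2C_ADDR    0x20   /* illustrative 7-bit address */
#define PCA9534_REG_CONFIG  0x03   /* direction register: 1 = input, 0 = output */

/* Write the PCA9534 configuration register; each 0 bit sets that pin as an output. */
int pca9534_write_config(XIicPs *iic, u8 direction)
{
    u8 buf[2] = { PCA9534_REG_CONFIG, direction };

    return XIicPs_MasterSendPolled(iic, buf, 2, PCA9534_I2C_ADDR);
}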

 

Once the I2C I/O expander’s pins have been set to output mode, we need to release the I2C reset by setting that pin high, and we need to prevent the HDMI output from powering down. We can again use very simple commands to perform these operations:

 

pca9534_set_pin_value(pdemo->piicps0, 4, 0);

 

The next step is to power up the CMOS Imager on the Camera Module. To do this, we need to set the path on the I2C mux to the Camera IO module and then use the CAT9554 I2C I/O Expander API to sequence the power supplies.

 

We can turn the CMOS imager’s power supplies on and off using the functions below:

 

cat9554_vdd18_en(pdemo->piicps0);

cat9554_vdd33_en(pdemo->piicps0);

cat9554_vddpix_en(pdemo->piicps0);

 

 

These commands turn on the power supplies. By replacing _en with _off, we can turn off power to the CMOS imager. We also need to use the usleep() function between these calls to ensure correct command sequencing, as shown in the sketch below.
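Putting those calls together, a minimal power-up sequence might look like the sketch below (assuming the Avnet CAT9554 header is included). The rail order follows the calls above; the 10 ms settling delays are an assumption, so check the PYTHON-1300 data sheet or the Avnet reference design for the required timing.

/* Power up the sensor rails in order, with settling time between each (delays are illustrative) */
cat9554_vdd18_en(pdemo->piicps0);
usleep(10000);
cat9554_vdd33_en(pdemo->piicps0);
usleep(10000);
cat9554_vddpix_en(pdemo->piicps0);
usleep(10000);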

 

 

We can configure the CMOS sensor after powering it up. In this instance—we will look at others in more detail in a subsequent blog—we initialize the sensor using this command:

 

 

onsemi_python_sensor_initialize(pdemo->pPython_receiver, SENSOR_INIT_ENABLE, 0);

 

 

 

At this point, you should note that the provided API saves us a significant amount of effort when it comes to setting up the Python imager. If we had to write this code from scratch, we’d need considerably more time to write, test, and debug it.

 

 

The command above runs the power-on sequences required to enable the image sensor for use, but it does not initiate an image stream from the device. To do that, we must call the same function with the argument that enables the stream:

 

 

onsemi_python_sensor_initialize(pdemo->pPython_receiver, SENSOR_INIT_STREAMON, 0);

 

 

The two final steps required to configure the CMOS imager are to enable correlated double sampling, which is used to generate the actual pixel values, and to set the horizontal sync timing. The overall sensor bring-up order is summarized in the sketch below.
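Pulling the sensor steps together, the bring-up order is: power-on initialization, stream on, then correlated double sampling and horizontal sync timing. The first two calls are exactly the ones shown above; the last two lines use hypothetical helper names, since the exact functions in the Avnet OnSemi_Python_SW API are not listed here.

onsemi_python_sensor_initialize(pdemo->pPython_receiver, SENSOR_INIT_ENABLE, 0);    /* power-on init */
onsemi_python_sensor_initialize(pdemo->pPython_receiver, SENSOR_INIT_STREAMON, 0);  /* start the image stream */
onsemi_python_enable_cds(pdemo->pPython_receiver);        /* hypothetical helper: enable correlated double sampling */
onsemi_python_set_hsync_timing(pdemo->pPython_receiver);  /* hypothetical helper: set horizontal sync timing */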

 

 

With that completed, we are finally free to configure the color filter array to handle the initial RGRG-ordered pixels from the imager, as shown in the illustration below.

 

 

RGRG pixel ordering from the imager

 

 

RGRG means that the first pixels received from the imager in the first line of each frame (starting with pixel (0,0)) are red, green, then red, green again, moving along the x axis as shown in the diagram above. The color filter array needs to know the starting color so that it can correctly reconstruct the image.
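In other words, the color filter array needs a Bayer-phase setting that identifies the 2x2 pattern starting at pixel (0,0): for an RGRG first line, the pattern is R G on line 0 and G B on line 1. A tiny sketch with hypothetical names (the actual register or API naming depends on the CFA core in the design):

/* For RGRG-first-line data the 2x2 Bayer tile at (0,0) is:
 *   R G
 *   G B
 * so the CFA is told to start in the "RGGB" phase (helper and constant names are hypothetical). */
cfa_set_bayer_phase(pdemo->pCfa, CFA_PHASE_RGGB);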

 

 

Finally, we configure the VDMA and start the transfers. After a few seconds, you should see the image appear on your screen, proving that the camera is working. You can find my application on my GitHub account.
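For reference, the sketch below shows roughly what configuring one VDMA channel involves using the standard Xilinx xaxivdma driver; the Avnet XAXIVDMA_EXT code wraps this up for us. The PYTHON-1300’s 1280x1024 resolution is real, but the 32-bit pixel format and the DDR frame-buffer address are assumptions for illustration.

#include "xaxivdma.h"
#include "xstatus.h"

/* Illustrative frame geometry and buffer address */
#define FRAME_WIDTH    1280
#define FRAME_HEIGHT   1024
#define BYTES_PER_PIX  4            /* assumed 32-bit pixels after the processing pipeline */
#define FRAME_BUFFER   0x10000000   /* assumed DDR frame-buffer address */

/* Configure and start the VDMA write channel so camera frames stream into DDR */
int vdma_start_write_channel(XAxiVdma *vdma)
{
    XAxiVdma_DmaSetup setup = {0};

    setup.VertSizeInput          = FRAME_HEIGHT;
    setup.HoriSizeInput          = FRAME_WIDTH * BYTES_PER_PIX;   /* line length in bytes */
    setup.Stride                 = FRAME_WIDTH * BYTES_PER_PIX;
    setup.EnableCircularBuf      = 1;                             /* cycle through the frame store(s) */
    setup.FrameStoreStartAddr[0] = FRAME_BUFFER;

    if (XAxiVdma_DmaConfig(vdma, XAXIVDMA_WRITE, &setup) != XST_SUCCESS)
        return XST_FAILURE;

    if (XAxiVdma_DmaSetBufferAddr(vdma, XAXIVDMA_WRITE, setup.FrameStoreStartAddr) != XST_SUCCESS)
        return XST_FAILURE;

    return XAxiVdma_DmaStart(vdma, XAXIVDMA_WRITE);
}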

 

 

This short blog just scratches the surface of the Python 1300C’s capabilities. Next week, we will look at how we can drive this imager for more customized applications.

 

 

 

The code is available on GitHub as always.

 

If you want e-book or hardback versions of previous MicroZed Chronicles blogs, you can get them below.

 

 

 

  • First Year E-Book here
  • Second Year Hardback here

 

 

 


 

 

  • Second Year E-Book here
  • Second Year Hardback here

 

 


 

 

 

You can also find links to all the previous MicroZed Chronicles blogs on my own website, here.

 

 

 
