Adventurer
467 Views
Registered: 05-04-2018

Microblaze Video Stream Concept

I have a Digilent Nexys Video Artix 7 board: https://store.digilentinc.com/nexys-video-artix-7-fpga-trainer-board-for-multimedia-applications/

I want to design a system that can stream video input from a camera and do color segmentation or edge detection. I do not have a camera and I need advice for a general workflow/block diagram so that I can get started. The questions that come to mind are the following:

1. What are your recommendations for a camera? I know there are MIPI CSI-2 and USB options (Camera Link and GigE might be a bit complex, but I'm open to suggestions). I'd like to pay under $100.

2. How would I go about finding the device video stream? For example, if I had a generic USB webcam I would expect that I would need to be running Petalinux on Microblaze to get the stream. Is this true? How would you recommend getting the video stream in Microblaze? Would it be easier to skip Microblaze?

3. Could I first use QEMU to test the Microblaze code (assuming I should be using Microblaze), or would QEMU not be able to acquire the video stream from my host computer?

4. My thought was to use video streaming drivers on Microblaze to plug in a USB camera to acquire the images and DMA them to the FPGA logic. I want to write edge detection or some other image processing in the fabric.

5. Where should I start with any of these image processing algorithms? Should I write this in C and have HLS generate it? Maybe Vivado or Vitis would have blocks to do this?

 

Thanks in advance for any help that you can provide.

5 Replies
Moderator
380 Views
Registered: 10-04-2017

Hi @doverstreet6,

 

My first thought when someone is creating a video project targeting an A-7 (Artix-7) is that they need to be very conscious of the footprint of their design.

Luckily, the A-7 used on the Nexys Video board is the largest of the A-7 devices.

[attached screenshot: samk_0-1599079337196.png]

 

That being said, you will still have limited space and need to keep that in mind when designing.

1. What are your recommendations for a camera? I know there are MIPI CSI-2 and USB options (Camera Link and GigE might be a bit complex, but I'm open to suggestions). I'd like to pay under $100.

I am not very familiar with USB cameras or the USB interface. USB is generally handled through the Embedded forums and is not categorized as Video IP. For a MIPI camera, I don't believe your board has the required interface unless you use an FMC daughter card.
An example of a kit that supports MIPI without a daughter card is the Spartan-7 SP701.

Based on your price range, USB may be your best option. For USB IP information, please ask on the Embedded forum.

 

2. How would I go about finding the device video stream? For example, if I had a generic USB webcam I would expect that I would need to be running Petalinux on Microblaze to get the stream. Is this true? How would you recommend getting the video stream in Microblaze? Would it be easier to skip Microblaze?

Please ask Linux-based questions on the Embedded forum, but I believe that once you have Linux booted you can query the attached devices.
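For what it's worth, once Linux is up with the UVC driver enabled, a USB webcam shows up as a /dev/video* node. A minimal sketch of that check (assuming OpenCV has been built into the rootfs, which is an extra PetaLinux step) could look like this:

```python
# Minimal sketch: list the V4L2 device nodes and try to grab one frame.
# Assumes a Linux target with the uvcvideo driver and opencv-python available.
import glob

import cv2

devices = sorted(glob.glob("/dev/video*"))
print("V4L2 devices found:", devices)

if devices:
    cap = cv2.VideoCapture(devices[0], cv2.CAP_V4L2)  # open via the V4L2 backend
    ok, frame = cap.read()                            # one frame as a sanity check
    print("opened:", cap.isOpened(), "frame shape:", frame.shape if ok else None)
    cap.release()
```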

3. Could I first use QEMU to test the Microblaze code (assuming I should be using Microblaze), or would QEMU not be able to acquire the video stream from my host computer?

Please post this to the Embedded forum.

4. My thought was to use video streaming drivers on Microblaze to plug in a USB camera to acquire the images and DMA them to the FPGA logic. I want to write edge detection or some other image processing in the fabric.

This is a good idea. I believe the USB driver is only written for PetaLinux and not for bare metal, meaning you would need to run PetaLinux rather than bare metal, which may be hard to do with MicroBlaze on an A-7. Again, for a better answer, check with the Embedded team.
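To make the intended flow concrete, here is a rough software-side sketch under PetaLinux: grab frames with OpenCV and copy them into a physically contiguous reserved buffer that a PL DMA engine (for example AXI VDMA) could read. The address, size, and the reserved region itself are assumptions for illustration only; in practice you would use a proper mechanism such as a kernel DMA driver or udmabuf rather than raw /dev/mem.

```python
# Rough sketch only. Assumes a reserved, physically contiguous buffer at
# RESERVED_PHYS (a hypothetical address carved out in the device tree) that the
# PL DMA is configured to read. /dev/mem access needs root and is shown purely
# to mark where the handoff to the fabric would happen.
import mmap
import os

import cv2

RESERVED_PHYS = 0x30000000        # hypothetical, page-aligned reserved region
FRAME_BYTES = 1280 * 720 * 3      # 720p, 24-bit BGR

cap = cv2.VideoCapture(0, cv2.CAP_V4L2)
fd = os.open("/dev/mem", os.O_RDWR | os.O_SYNC)
buf = mmap.mmap(fd, FRAME_BYTES, offset=RESERVED_PHYS)

ok, frame = cap.read()
if ok:
    buf.seek(0)
    buf.write(frame.tobytes())    # hand the frame to the (hypothetical) DMA buffer
    # ...then kick the DMA/accelerator via its control registers (not shown)

buf.close()
os.close(fd)
cap.release()
```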

5. Where should I start with any of these image processing algorithms? Should I write this in C and have HLS generate it? Maybe Vivado or Vitis would have blocks to do this?

There are many different ways of doing this; for HLS, see one of Adam Taylor's blog posts.
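If it helps to have a concrete starting point, here is a small NumPy sketch of Sobel edge detection (my own example, not from a Xilinx library). The same window-plus-magnitude structure is what you would later express in C/C++ for Vitis HLS, or hand-code in RTL with line buffers.

```python
# Software reference for Sobel edge detection; the explicit 3x3 window loop is
# deliberately close to how an HLS line-buffer implementation is structured.
import numpy as np


def sobel_edges(gray, thresh=100):
    """gray: 2-D uint8 image. Returns a binary edge map (uint8, 0 or 255)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.int32)
    ky = kx.T                      # Sobel y kernel is the transpose of x
    g = gray.astype(np.int32)
    h, w = g.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(1, h - 1):      # borders left at 0 for simplicity
        for x in range(1, w - 1):
            win = g[y - 1:y + 2, x - 1:x + 2]
            gx = int((win * kx).sum())
            gy = int((win * ky).sum())
            mag = abs(gx) + abs(gy)          # |gx|+|gy| is hardware-friendly
            out[y, x] = 255 if mag > thresh else 0
    return out
```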



To summarize, what you are trying to do will either be a lot of work, or more expensive than you are envisioning. 

You will need to look at all of your options and see which ones fit your end goal. 

Here are the questions you still need to answer:

  1. Which camera can you afford?
    1. If you can only afford a USB camera, are there bare-metal drivers available?
    2. If only PetaLinux drivers are available, do you have the resources and time to bring up PetaLinux on an A-7 MicroBlaze?
  2. Once you have the camera figured out, how will you do edge detection?
    1. Will you take something already written or implement something yourself?
  3. Do you have space in the device for everything you are looking to accomplish?
  4. What is your timeline for this project? Does the above fit into what you are envisioning?
    1. If not, is there a way to update the goals of the project to reflect the budget and time available?

 

-Sam

Don't forget to reply, kudo, and accept as solution.

Xilinx Video Design Hub
Teacher
376 Views
Registered: 06-16-2013

Hi @doverstreet6 

 

Do you have knowledge of embedded Linux, the MMU, and video streaming?

If not, and you can pay a bit more (around $150 or $250), I recommend the following boards.

You can use GStreamer to achieve what you want to do.
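For example (my sketch, assuming a UVC webcam at /dev/video0 and the GStreamer Python bindings installed on the target), a minimal capture-and-display pipeline looks like the following; later you would replace autovideosink with an appsink or a custom element that feeds your processing.

```python
# Minimal GStreamer pipeline from Python: USB webcam -> convert -> display.
# Assumes gstreamer1.0 with v4l2src (gst-plugins-good) and PyGObject on the target.
import gi

gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.parse_launch(
    "v4l2src device=/dev/video0 ! videoconvert ! autovideosink"
)
pipeline.set_state(Gst.State.PLAYING)

# Block until an error or end-of-stream, then clean up.
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.ERROR | Gst.MessageType.EOS)
pipeline.set_state(Gst.State.NULL)
```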

 

$150

https://shop.trenz-electronic.de/de/TE0726-03M-ZynqBerry-Modul-mit-Xilinx-Zynq-7010-in-Raspberry-Pi-Formfaktor

 

$250

https://www.avnet.com/wps/portal/us/products/new-product-introductions/npi/aes-ultra96-v2/

 

I think it will be hard to achieve what you want to do on the Nexys Video Artix-7 board without knowledge of embedded Linux.

 

Best regards,

Adventurer
338 Views
Registered: 05-04-2018

@samk 

 

Thank you for your detailed response.

1. I could spend more if it's necessary, but I just got the Nexys Video board during Cyber Week last year. I just finished grad school this month, so I haven't had time to work with it until now. I figured that, with it being called a Video board, it could handle image processing. I would certainly switch to a camera interface other than USB if it would significantly reduce the time spent. I just felt that webcam quality was all I needed and webcams are cheap, so I'd start there; if I really got into it, then it would warrant spending more.

        1. I have not found bare metal drivers for a USB webcam. I assumed they would not exist, but maybe I am wrong. I will look more into this.

        2. I have a ZCU102 board at work and have built/used Petalinux 2017.4 to use the SPI interface. Petalinux is a real pain, but if that's all I have as an option then I'll do it again if need be. This project is mostly a resume building project so that I can demonstrate as many areas of proficiency as possible.

2. For a class I used OpenCV to do edge detection, and I was assuming I could get OpenCV to work on PetaLinux; I believe I saw a couple of forum posts where people had it working. If it's not available then I will write my own algorithm; there should be examples of how to do this. I am just not proficient in VHDL or Verilog, but if it comes to Python/C/C++ then I can do it easily (a rough sketch of the loop I have in mind is after the list below). I would eventually like to implement it by hand in VHDL or Verilog, but I know that will be a long process, so I am trying to start simple.

3. I have no clue if I have space. I figured that if I literally only put in whatever USB IP is needed (or maybe with USB you just select the input pins and route them directly to the Microblaze; I've never used it, so I don't know) plus the Microblaze, then I should have enough room. I could be very wrong, though. What do you think? If there is an easy route that skips Microblaze I am very open to it, even if it requires VHDL or Verilog; it will just be a large time sink for the time being if I have to go straight to that, but if it's the best approach then I will do it regardless.

4. I am doing this in my free time, so there is no deadline. I was hoping to be able to do it in maybe 100 hours. My very high-level goal is to demonstrate, in my Git repository, some level of knowledge of Xilinx products for computer vision. With this approach I thought I could show the following:

  • Petalinux with Microblaze, installing drivers on it, writing code on it, emulating in QEMU
  • Vivado workflow in the sense I can write constraints and interface with a Microblaze and external peripherals
  • Simple CV algorithms
  • Real time algorithms (maybe 30 fps)
  • Maybe HLS or VHDL/Verilog at a beginner level
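
To show what I mean by a simple CV loop, here is roughly what I'd prototype on my desktop first (the camera index and resolution are just placeholders): capture frames, run Canny, and check the achieved frame rate.

```python
# Desktop prototype of the eventual loop: capture -> Canny edges -> measure fps.
# Camera index 0 and 640x480 are placeholders; the Canny call is the part I would
# eventually move into the fabric.
import time

import cv2

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

frames, t0 = 0, time.time()
while frames < 300:                      # roughly 10 s at 30 fps
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    frames += 1

elapsed = time.time() - t0
print(f"{frames} frames in {elapsed:.1f} s -> {frames / elapsed:.1f} fps")
cap.release()
```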

 

Thank you for taking the time to respond. I appreciate your insight.

Adventurer
334 Views
Registered: 05-04-2018

@watari 

 

I have worked with embedded operating systems like VxWorks 7, FreeRTOS, and PetaLinux, but not much with PetaLinux. I am not familiar with directly interfacing with the MMU or video streams. GStreamer sounds like what I'm looking for; thank you for mentioning it.

 

I had not seen those two boards. They are very good options if I can't make it work with my current board. Thanks for the suggestions!

Moderator
325 Views
Registered: 10-04-2017

Hi @doverstreet6,

 

As @watari mentioned, this will likely be much easier with a Zynq part than with an A-7 part.

Also, to address 2017.4: in the last three years the Linux solution from Xilinx and the surrounding Linux ecosystem have improved considerably, so I would point you to the latest releases.

For an example of USB with GStreamer, take a look at the ZCU102 TRD.
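If the OpenCV build on your target was compiled with GStreamer support (an assumption; stock builds vary), you can also hand a pipeline string straight to VideoCapture, which is a convenient way to combine the two:

```python
# Sketch: let OpenCV pull frames through a GStreamer pipeline.
# Requires an OpenCV build with GStreamer support; the device path is an assumption.
import cv2

pipeline = ("v4l2src device=/dev/video0 ! videoconvert ! "
            "video/x-raw,format=BGR ! appsink drop=true")
cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)

ok, frame = cap.read()
if ok:
    edges = cv2.Canny(frame, 50, 150)    # stand-in for the eventual PL accelerator
    print("got frame", frame.shape, "-> edge map", edges.shape)
cap.release()
```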

[attached screenshot: samk_0-1599148009017.png]

-Sam

 

Don't forget to reply, kudo, and accept as solution.

Xilinx Video Design Hub