08-29-2020 11:04 PM
I have a Digilent Nexys Video Artix 7 board: https://store.digilentinc.com/nexys-video-artix-7-fpga-trainer-board-for-multimedia-applications/
I want to design a system that can stream video input from a camera and do color segmentation or edge detection. I do not have a camera and I need advice for a general workflow/block diagram so that I can get started. The questions that come to mind are the following:
1. What are your recommendations for a camera? I know there are some that are MIPI CSI-2 and USB (Cameralink and GigE might be a bit complex but I'm open for suggestions). I'd like to pay under $100.
2. How would I go about finding the device video stream? For example, if I had a generic USB webcam I would expect that I would need to be running Petalinux on Microblaze to get the stream. Is this true? How would you recommend getting the video stream in Microblaze? Would it be easier to skip Microblaze?
3. Could I first use QEMU to test the Microblaze code assuming I should be using Microblaze or will this not be able to acquire the video stream from my host computer?
4. My thought was to use video streaming drivers on Microblaze to plug in a USB camera to acquire the images and DMA them to the FPGA logic. I want to write edge detection or some other image processing in the fabric.
5. Where should I start with any of these image processing algorithms? Should I write this in C and have HLS generate it? Maybe Vivado or Vitis would have blocks to do this?
Thanks in advance for any help that you can provide.
09-02-2020 02:50 PM
Hi @doverstreet6,
My first thought when someone is creating a video project targeting an A-7 is that they need to be very conscious about the footprint of their design.
Luckily the A-7 used on the Nexys board is the largest of the A-7 devices.
That being said, you will still have limited space and need to keep that in mind when designing.
1. What are your recommendations for a camera? I know there are some that are MIPI CSI-2 and USB (Cameralink and GigE might be a bit complex but I'm open for suggestions). I'd like to pay under $100.
I am not very familiar with USB cameras or the USB interface. USB is generally handled through the Embedded forums and is not categorized as a Video IP. For a MIPI camera, I don't believe your board has the required interface unless you use an FMC daughter card.
An example of a kit for MIPI without the daughter card is the Spartan-7 SP701.
Based on your price range, USB may be your best option. For USB IP information, please ask on the Embedded forum.
2. How would I go about finding the device video stream? For example, if I had a generic USB webcam I would expect that I would need to be running Petalinux on Microblaze to get the stream. Is this true? How would you recommend getting the video stream in Microblaze? Would it be easier to skip Microblaze?
Please ask Linux-based questions on the Embedded forum, but I believe once you have Linux booted you can query the mounted devices.
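As a rough sketch of what "querying mounted devices" looks like once Linux is up: a USB webcam handled by the `uvcvideo` driver appears as a `/dev/videoN` character device, which you can enumerate with a few lines of Python (the `dev_dir` parameter is only there so the helper can be exercised off-target; on the real board you would call it with the default `/dev`):

```python
import glob
import os

def find_video_devices(dev_dir="/dev"):
    """Return a sorted list of V4L2-style capture nodes (e.g. /dev/video0).

    On a booted embedded Linux target, USB webcams bound to the uvcvideo
    driver show up as /dev/videoN character devices.
    """
    return sorted(glob.glob(os.path.join(dev_dir, "video[0-9]*")))

if __name__ == "__main__":
    for dev in find_video_devices():
        print(dev)
```

On the target you would typically follow this up with `v4l2-ctl --list-devices` to see which node belongs to which camera.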
3. Could I first use QEMU to test the Microblaze code assuming I should be using Microblaze or will this not be able to acquire the video stream from my host computer?
Please post this to the Embedded forum.
4. My thought was to use video streaming drivers on Microblaze to plug in a USB camera to acquire the images and DMA them to the FPGA logic. I want to write edge detection or some other image processing in the fabric.
This is a good idea. However, I believe the USB driver is only written for PetaLinux and not for bare metal, meaning you would need to run PetaLinux rather than a bare-metal application, which may be hard to do with a MicroBlaze on an A-7. Again, for a better answer, check with the Embedded team.
5. Where should I start with any of these image processing algorithms? Should I write this in C and have HLS generate it? Maybe Vivado or Vitis would have blocks to do this?
There are many different ways of doing this, see an Adam Taylor blog for HLS.
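To make the algorithm side concrete before touching HLS: a minimal Sobel edge-detection kernel can be sketched in a few lines. This is a hypothetical pure-Python version for clarity; an HLS implementation would express the same 3x3 window arithmetic in C/C++ with line buffers, but the math is identical:

```python
def sobel_magnitude(img):
    """Approximate gradient magnitude |Gx| + |Gy| using 3x3 Sobel kernels.

    `img` is a list of rows of grayscale pixel values. Border pixels are
    left at zero, much as a hardware line-buffer implementation would
    leave the frame edges unprocessed.
    """
    h, w = len(img), len(img[0])
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]  # horizontal gradient kernel
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]  # vertical gradient kernel
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = gy = 0
            for dy in range(3):
                for dx in range(3):
                    p = img[y + dy - 1][x + dx - 1]
                    gx += gx_k[dy][dx] * p
                    gy += gy_k[dy][dx] * p
            out[y][x] = abs(gx) + abs(gy)  # cheap magnitude approximation
    return out
```

The |Gx| + |Gy| approximation (instead of a square root) is a common choice precisely because it maps cheaply to FPGA fabric.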
To summarize, what you are trying to do will either be a lot of work, or more expensive than you are envisioning.
You will need to look at all of your options and see which ones fit your end goal.
There are still several open questions you will need to answer before settling on an approach.
-Sam
09-02-2020 03:11 PM
Do you have experience with embedded Linux, the MMU, and video streaming?
If not, and you can spend a bit more (roughly $150 to $250), I recommend the following boards.
You can use GStreamer to achieve what you want to do.
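For a sense of what the GStreamer route looks like: a USB webcam preview is a one-line pipeline built from the standard `v4l2src`, `videoconvert`, and `autovideosink` elements. A small helper that assembles the `gst-launch-1.0` command (the device path and caps are assumptions; adjust them for your camera):

```python
def webcam_pipeline(device="/dev/video0", width=640, height=480, fps=30):
    """Build a gst-launch-1.0 command that previews a USB webcam.

    v4l2src captures from a V4L2 device, videoconvert handles any
    pixel-format conversion, and autovideosink picks a display sink.
    """
    caps = f"video/x-raw,width={width},height={height},framerate={fps}/1"
    return (
        f"gst-launch-1.0 v4l2src device={device} ! {caps} "
        "! videoconvert ! autovideosink"
    )

print(webcam_pipeline())
```

Swapping `autovideosink` for `appsink` (or a `filesink` after an encoder) is the usual next step when you want to hand frames off to your own processing instead of just displaying them.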
- $150
- $250: https://www.avnet.com/wps/portal/us/products/new-product-introductions/npi/aes-ultra96-v2/
I think it will be hard to achieve what you want to do on the Nexys Video Artix-7 board without knowledge of embedded Linux.
Best regards,
09-03-2020 07:31 AM
Thank you for your detailed response.
1. I could spend more if it's necessary, but I just got the Nexys Video board during Cyber Week last year. I just finished grad school this month, so I haven't had time to work with it until now. I figured that with it being called a Video board, it could handle image processing. I would certainly switch from USB to a different camera interface if it would significantly reduce the time spent. I just felt that webcam quality was all I needed and webcams were cheap, so I'd start there, and if I really got into it, then it would warrant spending more.
1. I have not found bare metal drivers for a USB webcam. I assumed they would not exist, but maybe I am wrong. I will look more into this.
2. I have a ZCU102 board at work and have built/used Petalinux 2017.4 to use the SPI interface. Petalinux is a real pain, but if that's all I have as an option then I'll do it again if need be. This project is mostly a resume building project so that I can demonstrate as many areas of proficiency as possible.
2. For class I used OpenCV to do edge detection. I was assuming I could get OpenCV to work on Petalinux. I believe I saw a couple of forum posts where people had it working. If it's not implemented then I will write my own algorithm. There should be examples on how to do this. I am just not proficient in VHDL or Verilog, but if it comes to Python/C/C++ then I can do it easily. I would eventually like to implement it by hand in VHDL or Verilog but I know that will be a long process. I am trying to start simple.
3. I have no clue if I have space. I figured if I only put in whatever USB IP is needed (or maybe with USB you just select the input pins and route them directly to the MicroBlaze; I've never used it, so I don't know) plus the MicroBlaze itself, then I should have enough room. I could be very wrong, though. What do you think? If there is an easy route that skips MicroBlaze, I am very open to it, even if it requires VHDL or Verilog. It would just be a large time sink for now if I have to go straight to HDL, but if that's the best approach then I will do it regardless.
4. I am doing this in my free time, so there is no deadline. I was hoping to be able to do it in maybe 100 hours. My very high-level goal is to use my Git repository to demonstrate some level of knowledge of Xilinx products for computer vision, and I thought this approach would let me show several of those skills at once.
Thank you for taking the time to respond. I appreciate your insight.
09-03-2020 07:41 AM - edited 09-03-2020 07:42 AM
I have worked with embedded OSes like VxWorks 7, FreeRTOS, and PetaLinux, though not much with PetaLinux. I am not familiar with directly interfacing with the MMU or the video stream. GStreamer sounds like what I'm looking for; thank you for mentioning it.
I had not seen those two boards. They are very good options if I can't make it work with my current board. Thanks for the suggestions!
09-03-2020 08:47 AM
Hi @doverstreet6,
As @watari mentioned, this will likely be much easier with a Zynq part than with an A-7 part.
Also, to address 2017.4: in the last three years, the Linux solution from Xilinx and the wider Linux ecosystem have improved considerably, so I would point you to the latest releases.
For an example of USB with Gstreamer, take a look at the ZCU102 TRD.
-Sam