05-02-2017 07:28 AM - edited 05-02-2017 07:55 AM
I'm trying to work out the best way to do a proper camera calibration and apply it on a Zynq device in a standalone video application, using the Vivado and HLS tools.
In the end I'd like to use an approach similar to the one in the standard OpenCV Camera Calibration Tutorial.
Looking at the functions available in the HLS video library, there are corner-detection functions that could be used for the chessboard corner detection step of the tutorial, but there doesn't seem to be an equivalent function to compute the camera calibration coefficients, which is the final step of the calibration process.
The goal of this process is to obtain the camera parameters and then use them to correct the camera distortion, and there is a function available in the HLS library to achieve this: hls::InitUndistortRectifyMap.
So, to get camera calibration and correction working on the Zynq device (please correct me if I'm wrong), the best way would be to follow the standard procedure with the OpenCV libraries (getting the camera stream or pictures on the PC), obtain the output parameters from the OpenCV calibrateCamera function, and then use them later with the InitUndistortRectifyMap function.
Has anybody done this before? Would this be the correct approach?
On the other hand, looking at the InitUndistortRectifyMap arguments:
void hls::InitUndistortRectifyMap( hls::Window<3, 3, CMT> cameraMatrix, DT (&distCoeffs)[N], hls::Window<3, 3, RT> R, hls::Window<3, 3, CMT> newcameraMatrix, hls::Mat<ROWS, COLS, MAP1_T>& map1, hls::Mat<ROWS, COLS, MAP2_T>& map2 );
And having the output from the opencv function:
calibrateCamera(object_points, image_points, image.size(), intrinsic, distCoeffs, rvecs, tvecs);
I'm trying to match and understand the arguments of the two functions, and the only ones I'm clear about are:
intrinsic => cameraMatrix (intrinsic matrix)
distCoeffs => distCoeffs (the DT array of N distortion coefficients)
It's not clear to me how to use the rest of the arguments: the OpenCV function gives two vectors, one for rotation (rvecs) and another for translation (tvecs), but the HLS function expects a 3x3 rotation matrix (R) and a 3x3 newcameraMatrix (an input matrix representing the camera in the new coordinate system)...
Does anybody have an idea how to use the InitUndistortRectifyMap arguments properly?
I had a look at the documentation but couldn't find more than a brief definition; is there any example or more detailed description?
Thanks in advance.
05-17-2017 06:19 PM
I'm doing camera calibration at work. First I get the parameters from OpenCV (cameraMatrix, distCoeffs, newCameraMatrix, etc.), then I use them in this function. The HLS initUndistortRectifyMap and remap functions seem to synthesize fine, but when I compare hmap1 and hmap2 between OpenCV and HLS, OpenCV supports negative numbers while HLS doesn't. The HLS documentation says the hmap1 data type only supports HLS_16SC2, but in the library header hls_video_undistort.h, hmap1 (internally u, v) is defined as ap_ufixed<...>, i.e. unsigned. Apart from the negative numbers, the hmap1 values match between OpenCV and HLS. I'm wondering if anyone else has seen this. To match OpenCV, I will make a customized module that defines u and v as ap_fixed instead. I'm using Vivado HLS 2016.4. Is this a bug in the HLS video library?
08-22-2017 10:57 PM
I am using hls::InitUndistortRectifyMap() for stereo calibration too. I haven't noticed the negative-value issue, but the calibrated image pairs are definitely not horizontally aligned, so there must be some problem.
08-29-2017 07:11 AM
I also got a similar issue. My camera matrix and distortion coefficients contain floating-point numbers, which caused problems when assigning the u and v variables, which are of type ap_ufixed. I changed them to float, and then it worked.
11-01-2017 01:54 AM
We use the InitUndistortRectifyMap and remap functions to rectify pictures, but the rectified image is not smooth: lines have a serrated (jagged) shape. I'd also like to know whether you encountered a problem with excessively large resource usage (RAM, DSP and LUT), as shown in the following picture:
Can this resource usage be reduced, and if so, how?
Thanks in advance.