
omnidirectional_calibration's Introduction

Omnidirectional camera calibration toolbox

This package consists of three parts.

  • omnidirectional camera calibration
  • random pattern calibration object
  • multiple camera calibration

Requirements

  • OpenCV 3.0

This package was selected as an OpenCV project in GSoC 2015, mentored by Bo Li. This repository was a standalone project during development, so you may notice that the directory structure mirrors OpenCV's. It has since been merged into OpenCV's opencv_contrib repository. You can use it either by compiling OpenCV with opencv_contrib, or by adding the source and include files from this repository to your own project.

Usage

Add the files in src and include to the appropriate locations; where they go depends on your development environment.

For omnidirectional camera calibration, use the function cv::omnidir::calibrate in omnidir.cpp. This API is fully compatible with OpenCV's cv::calibrateCamera.

For the random pattern calibration object, use the class RandomPatternCornerFinder in randpattern.cpp.

For multiple camera calibration, use the class MultiCameraCalibration in multicalib.cpp. So far, multiple camera calibration only supports the random pattern object. Calibration image filenames must follow the form "cameraIdx-timestamp.*", where cameraIdx starts from 0. Other naming schemes will not work!

Samples

The samples directory contains several samples that show how to use these APIs. Like the samples in OpenCV, they are programs that can be used directly in your application, e.g. for calibration. The data needed to run the samples is in the data directory.

Tutorial

In the tutorial directory, step by step tutorials are given.

A Video

This is a video that shows some results from this package.

Future Work

Revise the API of multiple camera calibration to be more general, so that object points and image points from each camera and view can be supplied directly.

omnidirectional_calibration's People

Contributors

jiuerbujie, prclibo


omnidirectional_calibration's Issues

Unable to use it with OpenCV 4.5.0

I am using a custom pattern image to get a calibration result with this library and OpenCV 4.5.0, and I am getting the following error:

number of matched points 201
number of filtered points 82
OpenCV: terminate handler is called! The last OpenCV error is:
OpenCV(4.5.0) Error: Assertion failed (nimages > 0) in calibrateCameraRO, file C:\OpenCV\opencv\sources\modules\calib3d\src\calibration.cpp, line 3694

#include "opencv2/ccalib/randpattern.hpp"
#include "opencv2/highgui.hpp"
#include "opencv2/imgproc.hpp"
#include "opencv2/calib3d.hpp"
#include <vector>
#include <iostream>
#include <time.h>
using namespace std;
using namespace cv;

static void saveCameraParams(const string &filename, Size imageSize, float patternWidth,
                             float patternHeight, int flags, const Mat &cameraMatrix, const Mat &distCoeffs,
                             const vector<Mat> &rvecs, const vector<Mat> &tvecs, double rms)
{
    FileStorage fs(filename, FileStorage::WRITE);
    time_t tt;
    time(&tt);
    struct tm *t2 = localtime(&tt);
    char buf[1024];
    strftime(buf, sizeof(buf) - 1, "%c", t2);

    fs << "calibration_time" << buf;

    if (!rvecs.empty())
        fs << "nframes" << (int)rvecs.size();

    fs << "image_width" << imageSize.width;
    fs << "image_height" << imageSize.height;
    fs << "pattern_width" << patternWidth;
    fs << "pattern_height" << patternHeight;

    fs << "flags" << flags;

    fs << "camera_matrix" << cameraMatrix;
    fs << "distortion_coefficients" << distCoeffs;

    fs << "rms" << rms;

    if (!rvecs.empty() && !tvecs.empty())
    {
        CV_Assert(rvecs[0].type() == tvecs[0].type());
        Mat bigmat((int)rvecs.size(), 6, rvecs[0].type());
        for (int i = 0; i < (int)rvecs.size(); i++)
        {
            Mat r = bigmat(Range(i, i + 1), Range(0, 3));
            Mat t = bigmat(Range(i, i + 1), Range(3, 6));

            CV_Assert(rvecs[i].rows == 3 && rvecs[i].cols == 1);
            CV_Assert(tvecs[i].rows == 3 && tvecs[i].cols == 1);
            //*.t() is MatExpr (not Mat) so we can use assignment operator
            r = rvecs[i].t();
            t = tvecs[i].t();
        }
        //cvWriteComment( *fs, "a set of 6-tuples (rotation vector + translation vector) for each view", 0 );
        fs << "extrinsic_parameters" << bigmat;
    }
}

int main(int argc, char const *argv[])
{
    randpattern::RandomPatternCornerFinder finder(6, 5.7, 20, 0, 1, 1);
    Mat pattern = imread("PyramidPattern.jpg", IMREAD_GRAYSCALE);
    finder.loadPattern(pattern);
    Mat image = imread("PyramidPatternTest.jpg", IMREAD_GRAYSCALE);
    finder.computeObjectImagePointsForSingle(image);
    vector<Mat> objectPoints = finder.getObjectPoints();
    vector<Mat> imagePoints = finder.getImagePoints();

    Mat K;
    Mat D;
    vector<Mat> rvec, tvec;
    double rms = calibrateCamera(objectPoints, imagePoints, image.size(), K, D, rvec, tvec);
    saveCameraParams("out_camera_params.xml", image.size(), 6, 5.7, CALIB_FIX_PRINCIPAL_POINT, K, D, rvec, tvec, rms);

    return 0;
}

Marker pattern: PyramidPattern (image attached)

Test image: PyramidPatternTest (image attached)

Also, whenever I try to inspect the objectPoints and imagePoints variables in the debugger, I see {...} instead of the collection of Mat inside the vector.

Please help me resolve this issue.

Some questions

I have some questions :

  1. "-pw <pattern_width> # physical width of random pattern" — what is the unit of measurement, mm or cm?
  2. In the resulting calibrated matrices, e.g. "camera_pose_4", is R4 relative to R1, or is it R41?
  3. After the Levenberg-Marquardt optimization, shouldn't R1 != E? Yet the resulting camera_pose_0 = E.
    @prclibo @jiuerbujie

How can I get the result described in omni_calibration tutorial

My System information (version)
OpenCV = 3.3
Operating System / Platform = Windows 10 64 Bit
Compiler = Visual Studio 2015

My Question
Hi, I want to calibrate an omnidirectional camera and rectify the original image (180° FOV), but I can't get the same result when I follow the tutorial described at https://docs.opencv.org/trunk/dd/d12/tutorial_omnidir_calib_main.html. When I rectify the original image to cylindrical images to preserve the whole view, my result loses some field of view.
Here is the code I use in my project:
flags |= omnidir::CALIB_FIX_SKEW;
flags |= omnidir::CALIB_FIX_CENTER;

//cv::InputArray Knew = Matx33f(new_size.width / 3.1415, 0, 0, 0, new_size.height / 3.1415, 0, 0, 0, 1);
//cv::omnidir::undistortImage(distorted, undistorted, K, D, xi, cv::omnidir::RECTIFY_CYLINDRICAL, Mat(), new_size);

Mat view, rview, map1, map2;
// concrete types here; binding cv::InputArray to a temporary leaves a dangling reference
Matx33d R = Matx33d::eye();
Mat P = cv::getOptimalNewCameraMatrix(K, D, distorted.size(), 1, distorted.size(), 0);
cv::omnidir::initUndistortRectifyMap(K, D, xi, R, P, distorted.size(), CV_16SC2, map1, map2, cv::omnidir::RECTIFY_CYLINDRICAL);
view = imread("1.jpg", IMREAD_COLOR);
remap(view, rview, map1, map2, INTER_LINEAR);
imshow("Image View", rview);

My result
My result and the tutorial result are below.
Can you help me? Many thanks!
[result image attached]

About the depth calculation in omnidir.cpp

Thanks for your great contribution !

I am confused about your implementation of the absolute depth calculation, which differs from the original paper.

At line 1483 of omnidir.cpp, the depth is calculated as follows:

float depth = float(baseline * f /realDis.at<float>(j, i));

but in the original paper 'Binocular Spherical Stereo', the absolute depth is calculated as follows:
[equation image from the paper], where the symbol on the left-hand side is the depth in the left image.

By the way, another repository, pdi, which provides a solution for getting a panoramic depth image from a single fisheye stereo image pair, uses the above equation to calculate depth.

I am wondering why you adopted this implementation, am I missing something ? or are these two implementations equivalent ?

Looking forward to your reply !

Get disparity maps that aids in 3D Reconstruction

hey,

The aim of my project is to generate disparity maps from a stereo fisheye camera. I have calibrated my camera using the omnidirectional module. I use the stereo reconstruction functions to generate the rectified images, and then OpenCV's post-filtering module to fill the holes for better disparity maps. The problem I am facing is that the two rectified images differ slightly in size (using RECTIFY_LONGI), so I get non-zero disparity values in the portions that do not overlap between the left and right images.

[left image, right image, and disparity image attached]

As you can see, there are some non-zero values in the disparity image, such as the portion outside the curved boundary of the cropped rectified images. This causes problems when performing 3D reconstruction from the disparity map, since it also includes the portion outside the actual image. Generating rectified images with the same cropping size for the left and right views would do the trick for me.

Please let me know how I can proceed. Alternatively, if you could tell me how to fill the holes in the map created by the stereo reconstruction function to get a smooth disparity map, that would be fine too.

Thanks

Can't compile samples

Hey! I'm not very C++ educated, and when I try to compile the sample files (like random_pattern_generator.cpp), I get an undefined reference error:
random_pattern_generator.cpp:(.text+0x2d8): undefined reference to 'cv::randpattern::RandomPatternGenerator::getPattern()'

Obviously randpattern.hpp is in the include/opencv2/ccalib/ folder. The command I used was:
g++ random_pattern_generator.cpp -o rgen `pkg-config --cflags --libs opencv`

On my system I have successfully built opencv and opencv_contrib. If I use the module ccalib from opencv_contrib, the error msg is the same. What am I doing wrong?
