Automatic Parking (Part 3): Parking-Slot Line Localization (Visual Localization)

1 Introduction
In the previous post I covered several methods for parking-slot detection. Detecting a slot really just means recovering its coordinates (u, v) in the image coordinate system, but image coordinates alone cannot be handed to the planning layer for a reverse-parking maneuver; what it needs are 3D coordinates. This post therefore gives a brief introduction to visual localization: how to go from 2D fisheye-image coordinates to 3D coordinates. Before reading it you should be familiar with the basic camera model; my earlier posts on stereo camera calibration cover much of that background.
This post draws on the following blog, which is very practical:

http://www.cnblogs.com/singlex/p/pose_estimation_1.html

First, let's take a look at the fisheye image.

2 Visual Localization Basics
The vehicle carries a moving coordinate frame of its own, just like a mobile robot. We define the origin of the world coordinate system at the center of the four wheels. The parking slot then has a coordinate (X, Y, Z) in this world frame and, likewise, a coordinate (x, y, z) in the camera-centered frame. What we need is the relationship between the two frames, i.e. the rotation and translation between them.
Using the 3D coordinates of several control points in the scene together with their perspective-projection coordinates in the image, we can solve for the absolute pose between the camera coordinate system and the world coordinate system describing the 3D scene, that is, the absolute translation vector t and the rotation matrix R. Methods of this kind are collectively called Perspective-N-Point (PnP) solvers. A control point here is a point whose 3D position is known accurately and whose corresponding image-plane coordinates are also known. For a perspective projection, at least three control points are needed for the PnP problem to have a determinate solution (three points generally leave a few candidate poses; a fourth point disambiguates).
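Written out in standard pinhole-camera notation (the symbols below are the conventional ones, not anything specific to this post), the relationship we are after is

$$ X_c = R\, X_w + t, \qquad s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = M\, X_c $$

where X_w = (X, Y, Z)^T is a point in the vehicle (world) frame, X_c is the same point in the camera frame, M is the intrinsic matrix, and s is the projective depth. PnP estimates R and t from a set of such point correspondences.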
OpenCV provides an API for this. The PnP solver in OpenCV is solvePnP; its signature and parameters are given below, with a minimal usage sketch after the parameter list.

bool solvePnP(InputArray objectPoints, InputArray imagePoints, InputArray cameraMatrix, InputArray distCoeffs, OutputArray rvec, OutputArray tvec, bool useExtrinsicGuess=false, int flags = CV_ITERATIVE)

Parameters:
objectPoints - coordinates of the control points in the world coordinate system; a vector<Point3f> can be used here
imagePoints - coordinates of the corresponding control points in the image coordinate system; a vector<Point2f> can be used here
cameraMatrix - the camera intrinsic matrix
distCoeffs - the camera distortion coefficients
(the two parameters above are obtained from camera calibration; earlier posts give the calibration code)
rvec - output rotation vector, which rotates points from the world coordinate system into the camera coordinate system
tvec - output translation vector, which translates points from the world coordinate system into the camera coordinate system
flags - defaults to the iterative method CV_ITERATIVE
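For quick experiments, the same call through the Python binding cv2.solvePnP looks like the sketch below; every numeric value in it is a placeholder chosen for illustration, not the calibration used later in this post.

import numpy as np
import cv2

# Four control points on the ground plane (Z = 0), in world coordinates (mm) -- placeholder values
object_points = np.array([[0, 0, 0],
                          [1000, 0, 0],
                          [1000, 1000, 0],
                          [0, 1000, 0]], dtype=np.float64)
# Their measured projections in the image (pixels) -- placeholder values
image_points = np.array([[320, 240],
                         [400, 238],
                         [405, 300],
                         [318, 305]], dtype=np.float64)
# Intrinsics and distortion coefficients from calibration -- placeholder values
camera_matrix = np.array([[330.0, 0.0, 640.0],
                          [0.0, 330.0, 360.0],
                          [0.0, 0.0, 1.0]], dtype=np.float64)
dist_coefs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coefs)
rotM, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
print(rotM, tvec)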

3 Using the Algorithm

(1) Solving for R and T
Since OpenCV 2.x and later already provide a PnP API, the difficulty of using PnP becomes constructing a scene in which it can be applied. The simplest setup we currently use is five points: four serve as control points and one as a test point, with the scene arranged physically so that only these control points appear. As shown in the figure below, I picked four points on a calibration board and determined, for each, its image coordinates (u, v) and its coordinates (X, Y, Z) in the vehicle world frame. From these correspondences we solve for the R and T between the two coordinate systems.
(Figure: the four control points marked on the calibration board)
The schematic is as follows:
(Figure: schematic of the image-to-world coordinate transformation)
(2) Solving for the scale factor s

The formula below converts between image coordinates and world coordinates; the first matrix is the camera intrinsic matrix and the second is the camera extrinsic matrix. In this equation the image coordinates are known and the intrinsic matrix has been obtained by calibration, so the only thing still missing is s.
$$ s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = M \,[\,R \mid t\,] \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} $$
Rearranging the conversion formula gives the left-hand expression below, where M is the camera intrinsic matrix, R the rotation matrix, t the translation vector, and Z the known Z coordinate of the target point in the world frame (zConst in the code; 0 for points on the ground plane). Transforming the left expression yields the right one, in which, once R, M, t and Z are known, s is the only unknown, so s can be solved for.
$$ R^{-1}\!\left( s\, M^{-1} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} - t \right) = \begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
\qquad\Longrightarrow\qquad
s = \frac{Z + \left[ R^{-1} t \right]_3}{\left[ R^{-1} M^{-1} (u, v, 1)^{T} \right]_3} $$

where [ · ]_3 denotes the third component of the vector.
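To make the rearranged formula concrete, here is a minimal back-projection sketch in Python/NumPy. It assumes rotM, tvec and camera_matrix have already been obtained from solvePnP as above, and it ignores lens distortion, just as the code below does; the function name pixel_to_world is only illustrative.

import numpy as np

def pixel_to_world(u, v, rotM, tvec, camera_matrix, z_const=0.0):
    # Back-project pixel (u, v) onto the world plane Z = z_const
    p = np.array([[u], [v], [1.0]])                 # homogeneous pixel (u, v, 1)
    R_inv = np.linalg.inv(rotM)
    M_inv = np.linalg.inv(camera_matrix)
    # s = (Z + [R^-1 t]_3) / [R^-1 M^-1 p]_3
    s = (z_const + (R_inv @ tvec)[2, 0]) / (R_inv @ M_inv @ p)[2, 0]
    # [X, Y, Z]^T = R^-1 (s M^-1 p - t)
    return R_inv @ (s * M_inv @ p - tvec)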

4 Implementation

The program has two steps:
(1) Compute the parameter s and the rotation and translation matrices. This requires a set of points in the world coordinate system and their corresponding points in the image coordinate system: objectPoints holds the world coordinates, imagePoints the image coordinates.
(2) Compute the world coordinates corresponding to an input image coordinate.

 //Input parameters
	Mat cameraMatrix = Mat(3, 3, CV_32FC1, Scalar::all(0)); /* camera intrinsic matrix */
	Mat distCoeffs = Mat(1, 5, CV_32FC1, Scalar::all(0)); /* the camera's 5 distortion coefficients: k1, k2, p1, p2, k3 */
	double zConst = 0; // known Z coordinate of the target plane in the world frame (0: ground plane)
	
	//Computed parameters
	double s;
	Mat rotationMatrix = Mat (3, 3, DataType<double>::type);
	Mat tvec = Mat (1, 3, cv::DataType<double>::type);
	void calcParameters(vector<cv::Point2f> imagePoints, vector<cv::Point3f> objectPoints)
	{
		//Compute the rotation and translation
		Mat rvec(1, 3, cv::DataType<double>::type);
		cv::solvePnP(objectPoints, imagePoints, cameraMatrix, distCoeffs, rvec, tvec);
		cv::Rodrigues(rvec, rotationMatrix);
		// Homogeneous pixel (u, v, 1), initialized here to (1, 1, 1); in practice s should be
		// recomputed for the pixel you actually want to back-project (see the full program below)
		cv::Mat imagePoint = cv::Mat::ones(3, 1, cv::DataType<double>::type);

		//Compute the scale factor s
		cv::Mat tempMat, tempMat2;
		tempMat = rotationMatrix.inv() * cameraMatrix.inv() * imagePoint;
		tempMat2 = rotationMatrix.inv() * tvec;
		s = zConst + tempMat2.at<double>(2, 0);
		s /= tempMat.at<double>(2, 0);
	}

(2) Compute world coordinates from an input image coordinate. Given an image coordinate, the function below returns the corresponding real-world coordinate:

Point3f getWorldPoints(Point2f inPoints)
	{
		cv::Mat imagePoint = cv::Mat::ones(3, 1, cv::DataType<double>::type); //u,v,1
		imagePoint.at<double>(0, 0) = inPoints.x;
		imagePoint.at<double>(1, 0) = inPoints.y;
		Mat wcPoint = rotationMatrix.inv() * (s * cameraMatrix.inv() * imagePoint - tvec);
		Point3f worldPoint(wcPoint.at<double>(0, 0), wcPoint.at<double>(1, 0), wcPoint.at<double>(2, 0));
		return worldPoint;
	}

The complete program for computing R, T, the rotation angles, and so on:

#include <opencv2/opencv.hpp>
#include <iostream>
#include <ctype.h>
using namespace cv;
using namespace std;

int main()
{
    //Image coordinates (u, v) and real-world coordinates (X, Y, Z)
    Point2f point;
    vector<Point2f> objectPoints;   // holds a fifth image point, used only for testing/visualization
    vector<Point2f> boxPoints;      // the four control points in image coordinates
///////// Loading image
    Mat sourceImage = imread("/home/lenovo/CLionProjects/2D_3D/2.bmp");
    namedWindow("Source", 1);
    ///// Setting box corners
    /// /////one Point/////////////////////////////
    point = Point2f((float) 558, (float) 259); //640X480
    boxPoints.push_back(point);
    circle(sourceImage, boxPoints[0], 3, Scalar(0, 255, 0), -1, 8);

    /// /////two Point/////////////////////////////
    point = Point2f((float) 629, (float) 212); //640X480
    boxPoints.push_back(point);
    circle(sourceImage, boxPoints[1], 3, Scalar(0, 255, 0), -1, 8);

    /// /////three Point/////////////////////////////
    point = Point2f((float) 744, (float) 260); //640X480
    boxPoints.push_back(point);
    circle(sourceImage, boxPoints[2], 3, Scalar(0, 255, 0), -1, 8);


    /// /////four Point/////////////////////////////
    point = Point2f((float) 693, (float) 209); //640X480
    boxPoints.push_back(point);
    circle(sourceImage, boxPoints[3], 3, Scalar(0, 255, 0), -1, 8);
    ///// Setting object corners /////

    /// /////five Point/////////////////////////////
    point = Point2f((float) 739, (float) 184); //640X480
    objectPoints.push_back(point);
    circle(sourceImage, objectPoints[0], 3, Scalar(0, 255, 0), -1, 8);

   // point = Point2f((float) 426.5, (float) 251.5); //640X480
    //objectPoints[0].push_back(point);
    //circle(sourceImage, objectPoints[0][1], 3, Scalar(0, 255, 0), -1, 8);
   // imshow("Source", sourceImage);

    //////////real world coordinates//////////////////////
    vector<Point3f> worldBoxPoints;
    Point3f tmpPoint;
    tmpPoint = Point3f((float) 2750, (float) 890, (float) 0);
    worldBoxPoints.push_back(tmpPoint);
    tmpPoint = Point3f((float) 3500, (float) 450, (float) 0);
    worldBoxPoints.push_back(tmpPoint);
    tmpPoint = Point3f((float) 2790, (float) -240, (float) 0);
    worldBoxPoints.push_back(tmpPoint);
    tmpPoint = Point3f((float) 3620, (float) -50, (float) 0);
    worldBoxPoints.push_back(tmpPoint);

    std::cout << "There are " << boxPoints.size() << " roomPoints and " << worldBoxPoints.size()
              << " worldRoomPoints." << std::endl;

    //////camera intrinsics///////////////////////////////
    cv::Mat cameraMatrix1=Mat::eye(3, 3, cv::DataType<double>::type);

    cv::Mat distCoeffs1(5, 1, cv::DataType<double>::type);

    distCoeffs1.at<double>(0,0) = 0.061439051;
    distCoeffs1.at<double>(1,0) = 0.03187556;
    distCoeffs1.at<double>(2,0) = -0.00726151;
    distCoeffs1.at<double>(3,0) = -0.00111799;
    distCoeffs1.at<double>(4,0) = -0.00678974;

    //Intrinsic parameters from camera calibration
    double fx = 328.61652824;
    double fy = 328.56512516;
    double cx = 629.80551148;
    double cy = 340.5442837;

    cameraMatrix1.at<double>(0, 0) = fx;
    cameraMatrix1.at<double>(1, 1) = fy;
    cameraMatrix1.at<double>(0, 2) = cx;
    cameraMatrix1.at<double>(1, 2) = cy;

    std::cout << "After calib cameraMatrix --- 1: " << cameraMatrix1 << std::endl;
    std::cout << "After calib distCoeffs: --- 1" << distCoeffs1 << std::endl;

    //////camera intrinsics///////////////////////////////



    cv::Mat rvec1(3, 1, cv::DataType<double>::type);
    cv::Mat tvec1(3, 1, cv::DataType<double>::type);

    cv::solvePnP(worldBoxPoints, boxPoints, cameraMatrix1, distCoeffs1, rvec1, tvec1, false,CV_ITERATIVE);

    ///////////////R to theta/////////////////////////////
    std::cout << "rvec --- 1: " << rvec1 << std::endl;
    std::cout << "tvec --- 1: " << tvec1 << std::endl;
    cv::Mat rvecM1(3, 3, cv::DataType<double>::type);

    cv::Rodrigues(rvec1, rvecM1);

    const double PI=3.1415926;

    double thetaZ=atan2(rvecM1.at<double>(1,0),rvecM1.at<double>(0,0))/PI*180;

    double thetaY=atan2(-1*rvecM1.at<double>(2,0),sqrt(rvecM1.at<double>(2,1)*rvecM1.at<double>(2,1)
            +rvecM1.at<double>(2,2)*rvecM1.at<double>(2,2)))/PI*180;
    double thetaX=atan2(rvecM1.at<double>(2,1),rvecM1.at<double>(2,2))/PI*180;

    cout<<"theta x  "<<thetaX<<endl<<"theta Y: "<<thetaY<<endl<<"theta Z: "<<thetaZ<<endl;

    ///////////////compute  S ////////////////////////
    cv::Mat imagePoint = cv::Mat::ones(3, 1, cv::DataType<double>::type); // homogeneous pixel (u, v, 1), filled in below

    cv::Mat tempMat, tempMat2;

    imagePoint.at<double>(0,0)=558;
    imagePoint.at<double>(1,0)=259;
    double zConst = 0; // known Z coordinate of the target point in the world frame (0: ground plane)

    //Compute the scale factor s
    double s;

    tempMat = rvecM1.inv() * cameraMatrix1.inv() * imagePoint;

    tempMat2 = rvecM1.inv() * tvec1;

    s = zConst + tempMat2.at<double>(2, 0);

    s /= tempMat.at<double>(2, 0);
    cout<<"s : "<<s<<endl;
    ///////////////compute  S ////////////////////////
      ///3D to 2D////////////////////////////
    cv::Mat worldPoints=Mat::ones(4,1,cv::DataType<double>::type); // homogeneous world point (X, Y, Z, 1)

    worldPoints.at<double>(0,0)=3620;
    worldPoints.at<double>(1,0)=-590;
    worldPoints.at<double>(2,0)=0;

    cout<<"world Points :  "<<worldPoints<<endl;

    Mat image_points=Mat::ones(3,1,cv::DataType<double>::type);

    //setIdentity(image_points);

    Mat RT_;

    hconcat(rvecM1,tvec1,RT_);

    cout<<"RT_"<<RT_<<endl;

    // Project: [u, v, 1]^T ~ M [R|t] X_w, then normalize by the third component
    image_points=cameraMatrix1*RT_*worldPoints/s;
    Mat D_Points=Mat::ones(3,1,cv::DataType<double>::type);
    D_Points.at<double>(0,0)=image_points.at<double>(0,0)/image_points.at<double>(2,0);
    D_Points.at<double>(1,0)=image_points.at<double>(1,0)/image_points.at<double>(2,0);

    //cv::projectPoints(worldPoints, rvec1, tvec1, cameraMatrix1, distCoeffs1, imagePoints);
    cout<<"3D to 2D:   "<<D_Points<<endl;


    Mat camera_coordinates=-rvecM1.inv()*tvec1; // camera center expressed in world coordinates

    cout<<"camera_coordinates:  "<<camera_coordinates<<endl;


    /////////////////////2D to 3D///////////////////////
    cv::Mat imagePoint_known = cv::Mat::ones(3, 1, cv::DataType<double>::type); // homogeneous pixel (u, v, 1)

    imagePoint_known.at<double>(0, 0) = 558;

    imagePoint_known.at<double>(1, 0) = 259;

    Mat wcPoint = rvecM1.inv() * (cameraMatrix1.inv() * s * imagePoint_known - tvec1);

    Point3f worldPoint(wcPoint.at<double>(0, 0), wcPoint.at<double>(1, 0), wcPoint.at<double>(2, 0));

    cout <<"2D to 3D :"<< wcPoint << endl;
    /////////////////////2D to 3D///////////////////////
    imshow("Source",sourceImage);

    waitKey(0);

    return 0;
}

5 Compared with the C++ version, the Python version really is much simpler...

import cv2
from numpy import *
import numpy as np
import math
object_3d_points = np.array(([2750, 890, 0],
                            [3500, 450, 0],
                            [3620, -50,  0],
                            [2790, -240, 0]), dtype=np.double)
object_2d_point = np.array(([558, 259],
                            [629, 212],
                            [693, 209],
                            [744, 260]), dtype=np.double)
camera_matrix = np.array(([328.47364883, 0, 629.0359318],
                         [0, 328.4303046, 340.13011656],
                         [0, 0, 1.0]), dtype=np.double)

# dist_coefs = np.array([0.06147452, 0.03188344, -0.00678974, 0, 0], dtype=np.double)
dist_coefs = np.array([0.06147452, 0.03188344, 0, 0, -0.00678974], dtype=np.double)
# Solve for the camera pose
found, rvec, tvec = cv2.solvePnP(object_3d_points, object_2d_point, camera_matrix, dist_coefs)
# print(rvec, tvec)
rotM = cv2.Rodrigues(rvec)[0]
camera_position = -np.matrix(rotM).T * np.matrix(tvec)
# print(camera_position.T)
# Verify the camera pose using the method from http://www.cnblogs.com/singlex/p/pose_estimation_1.html
# Euler angles of the camera about its three axes; applying these rotations maps back into the world frame. Rotation order: z, y, x
thetaZ = math.atan2(rotM[1, 0], rotM[0, 0])*180.0/math.pi
thetaY = math.atan2(-1.0*rotM[2, 0], math.sqrt(rotM[2, 1]**2 + rotM[2, 2]**2))*180.0/math.pi
thetaX = math.atan2(rotM[2, 1], rotM[2, 2])*180.0/math.pi
# Values in the camera coordinate system
x = tvec[0]
y = tvec[1]
z = tvec[2]


# Apply the three rotations
def RotateByZ(Cx, Cy, thetaZ):
    rz = thetaZ*math.pi/180.0
    outX = math.cos(rz)*Cx - math.sin(rz)*Cy
    outY = math.sin(rz)*Cx + math.cos(rz)*Cy
    return outX, outY


def RotateByY(Cx, Cz, thetaY):
    ry = thetaY*math.pi/180.0
    outZ = math.cos(ry)*Cz - math.sin(ry)*Cx
    outX = math.sin(ry)*Cz + math.cos(ry)*Cx
    return outX, outZ


def RotateByX(Cy, Cz, thetaX):
    rx = thetaX*math.pi/180.0
    outY = math.cos(rx)*Cy - math.sin(rx)*Cz
    outZ = math.sin(rx)*Cy + math.cos(rx)*Cz
    return outY, outZ


(x, y) = RotateByZ(x, y, -1.0*thetaZ)
(x, z) = RotateByY(x, z, -1.0*thetaY)
(y, z) = RotateByX(y, z, -1.0*thetaX)
Cx = x*-1
Cy = y*-1
Cz = z*-1
# Print the camera position
print(Cx, Cy, Cz)
# Print the camera rotation angles
print(thetaX, thetaY, thetaZ)
# Verify with the fifth point: project its world coordinates back into the image (3D -> 2D)
Out_matrix = np.concatenate((rotM, tvec), axis=1)
pixel = np.dot(camera_matrix, Out_matrix)
pixel1 = np.dot(pixel, np.array([4120, 270, 0, 1], dtype=np.double))
pixel2 = pixel1/pixel1[2]
# print(pixel1[2])
print(pixel2)

# Compute the scale factor s
temmat = mat(rotM).I * mat(camera_matrix).I * mat([754, 217, 1]).T
temmat2 = mat(rotM).I * mat(tvec)
# print(temmat, temmat2)
s = temmat2[2]
s = s/temmat[2]
# print(s)


# Compute the world coordinates (2D -> 3D)
wcpoint = mat(rotM).I * (s[0, 0] * mat(camera_matrix).I * mat([754, 217, 1]).T - mat(tvec))
print(wcpoint)
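
As a sanity check on the recovered pose, the four control points can be reprojected with cv2.projectPoints and compared against the measured pixel coordinates; a small sketch, reusing the variables defined above, might look like this:

# Reproject the control points with the recovered pose and report the
# reprojection error (in pixels) against the measured image coordinates
projected, _ = cv2.projectPoints(object_3d_points, rvec, tvec, camera_matrix, dist_coefs)
errors = np.linalg.norm(projected.reshape(-1, 2) - object_2d_point, axis=1)
print("reprojection error (px):", errors)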