Personal blog: http://www.chenjianqu.com/
Original article: http://www.chenjianqu.com/show-106.html
In the post <ORB-SLAM2 Compilation and Installation> we compiled and installed ORB-SLAM2 and ran its bundled ROS examples. However, that ROS program is built with rosbuild, while catkin is now the standard, so it has limited reference value. This article documents how to call ORB-SLAM2 from ROS: ORB-SLAM2 takes ROS messages as input, and the estimated poses are published back out through ROS.
1. Creating a New ROS Project
The post <ROS: Creating Packages and Nodes> covered how to create a ROS workspace and package. Here we create a package named slam:
#Create the workspace
(base) chenjianqu@chen:~$ cd ros
(base) chenjianqu@chen:~/ros$ cd project
(base) chenjianqu@chen:~/ros/project$ mkdir -p ws2/src
(base) chenjianqu@chen:~/ros/project$ cd ws2
(base) chenjianqu@chen:~/ros/project/ws2$ catkin_make #initialize the workspace
#Create the package
(base) chenjianqu@chen:~/ros/project/ws2$ cd src
(base) chenjianqu@chen:~/ros/project/ws2/src$ catkin_create_pkg slam cv_bridge image_transport roscpp sensor_msgs std_msgs tf message_filters
2. Writing the Source Code
First write the source code. Add orb_slam_run.cpp under the src directory:
#include <stdio.h>
#include <iostream>
#include <algorithm>
#include <fstream>
#include <chrono>
#include <ros/ros.h>
#include <image_transport/image_transport.h>
#include <cv_bridge/cv_bridge.h>
#include <message_filters/subscriber.h>
#include <message_filters/time_synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>
#include <tf/transform_broadcaster.h>
#include <eigen_conversions/eigen_msg.h>
#include <std_msgs/Time.h>
#include <opencv2/core/core.hpp>
#include <opencv2/core/eigen.hpp>
#include <Eigen/Core>
#include <Eigen/Geometry>
#include <Eigen/Dense>
#include "System.h"
using namespace std;
class ImageGrabber
{
public:
ImageGrabber(ORB_SLAM2::System* pSLAM);
void GrabRGBD(const sensor_msgs::ImageConstPtr& msgRGB,const sensor_msgs::ImageConstPtr& msgD);
protected:
ORB_SLAM2::System* mpSLAM;
tf::TransformBroadcaster* br;
};
int main(int argc, char** argv)
{
ros::init(argc, argv, "orb_slam_node");//initialize the node
ros::start();//start the node
if(argc != 3){
cout<<"Required arguments: vocabulary path, settings file path" << endl;
ros::shutdown();//shut down the node
return 1;
}
//Initialize ORB-SLAM2
ORB_SLAM2::System SLAM(argv[1],argv[2],ORB_SLAM2::System::RGBD,true);
ImageGrabber igb(&SLAM);
ros::NodeHandle nh;
message_filters::Subscriber<sensor_msgs::Image> rgb_sub(nh, "/camera/rgb/image_raw", 1);
message_filters::Subscriber<sensor_msgs::Image> depth_sub(nh, "/camera/depth_registered/image_raw", 1);
typedef message_filters::sync_policies::ApproximateTime<sensor_msgs::Image, sensor_msgs::Image> sync_pol;
message_filters::Synchronizer<sync_pol> sync(sync_pol(10), rgb_sub,depth_sub);
sync.registerCallback(boost::bind(&ImageGrabber::GrabRGBD,&igb,_1,_2));
ros::spin();
SLAM.Shutdown();
SLAM.SaveKeyFrameTrajectoryTUM("KeyFrameTrajectory.txt");
ros::shutdown();
return 0;
}
ImageGrabber::ImageGrabber(ORB_SLAM2::System* pSLAM):mpSLAM(pSLAM)
{
br=new tf::TransformBroadcaster();
}
void ImageGrabber::GrabRGBD(const sensor_msgs::ImageConstPtr& msgRGB,const sensor_msgs::ImageConstPtr& msgD)
{
ros::Time timestamp= msgRGB->header.stamp;//timestamp
cv_bridge::CvImageConstPtr cv_ptrRGB;
try{
cv_ptrRGB = cv_bridge::toCvShare(msgRGB);
}
catch (cv_bridge::Exception& e){
ROS_ERROR("cv_bridge exception: %s", e.what());
return;
}
cv_bridge::CvImageConstPtr cv_ptrD;
try{
cv_ptrD = cv_bridge::toCvShare(msgD);
}
catch (cv_bridge::Exception& e){
ROS_ERROR("cv_bridge exception: %s", e.what());
return;
}
//Call ORB-SLAM2; TrackRGBD() returns the world-to-camera transform Tcw
cv::Mat Tcw=mpSLAM->TrackRGBD(cv_ptrRGB->image,cv_ptrD->image,cv_ptrRGB->header.stamp.toSec());
if(Tcw.empty())//tracking lost, nothing to publish
return;
tf::Transform m;
//Set the translation
m.setOrigin(
tf::Vector3(
Tcw.at<float>(0,3),
Tcw.at<float>(1,3),
Tcw.at<float>(2,3)
)
);
//Set the rotation
tf::Matrix3x3 Rcw;
Rcw.setValue( //copy cv::Mat into tf::Matrix3x3
Tcw.at<float>(0,0),Tcw.at<float>(0,1),Tcw.at<float>(0,2),
Tcw.at<float>(1,0),Tcw.at<float>(1,1),Tcw.at<float>(1,2),
Tcw.at<float>(2,0),Tcw.at<float>(2,1),Tcw.at<float>(2,2)
);
double roll,pitch,yaw;
Rcw.getRPY(roll, pitch, yaw, 1);//rotation matrix to roll-pitch-yaw
tf::Quaternion q;
q.setRPY(roll,pitch,yaw);//RPY to quaternion
m.setRotation(q);
//Publish the transform
br->sendTransform(tf::StampedTransform(m, timestamp, "world", "orb_slam2"));
}
If you have read the ORB-SLAM2 source code, you know its entry points are defined in System.cc. TrackRGBD() is the entry function of the RGB-D version; it takes an OpenCV-format RGB image and depth image as input, so cv_bridge is used to convert the sensor_msgs/Image messages into OpenCV format. The callback is wrapped in a class so that extra state (the SLAM system pointer and the TF broadcaster) can be passed to it.
The depth image and the RGB image arrive as two separate messages, so message_filters is used to align them in time.
The pose returned by ORB-SLAM2 is a cv::Mat; it is converted first to a tf::Matrix3x3, then to roll-pitch-yaw angles, then to a quaternion, before it can be put into a tf::Transform. What is finally published is the transform from the world frame to the orb_slam2 frame. Note that Tcw is the world-to-camera transform; if you want the camera's pose expressed in the world frame, invert it first. For TF in ROS, see <ROS: Coordinate Transforms>.
Next, write the CMakeLists.txt:
cmake_minimum_required(VERSION 2.8.3)
project(slam)
set(CMAKE_BUILD_TYPE Release)
add_compile_options(-std=c++11)
find_package(catkin REQUIRED COMPONENTS
cv_bridge
image_transport
roscpp
sensor_msgs
std_msgs
tf
message_filters
)
#Path to ORB_SLAM2
set(ORB_SLAM_PATH /home/chen/ros/ORB_SLAM2)
message("ORB_SLAM_PATH = ${ORB_SLAM_PATH} ")
LIST(APPEND CMAKE_MODULE_PATH ${ORB_SLAM_PATH}/cmake_modules)
find_package(OpenCV)
find_package(Eigen3 3.1.0 REQUIRED)
find_package(Pangolin REQUIRED)
catkin_package(
# INCLUDE_DIRS include
# LIBRARIES serve_test
# CATKIN_DEPENDS roscpp rospy std_msgs
# DEPENDS system_lib
)
include_directories(
include
${catkin_INCLUDE_DIRS}
${OpenCV_INCLUDE_DIRS}
${ORB_SLAM_PATH}
${ORB_SLAM_PATH}/include
${Pangolin_INCLUDE_DIRS}
)
set(LIBS
${OpenCV_LIBS}
${EIGEN3_LIBS}
${Pangolin_LIBRARIES}
${ORB_SLAM_PATH}/Thirdparty/DBoW2/lib/libDBoW2.so
${ORB_SLAM_PATH}/Thirdparty/g2o/lib/libg2o.so
${ORB_SLAM_PATH}/lib/libORB_SLAM2.so
-lboost_system
)
add_executable(orb_slam_node src/orb_slam_run.cpp)
target_link_libraries(orb_slam_node
${catkin_LIBRARIES}
${LIBS}
)
The above should need little explanation.
3. Running and Testing
With the source code written, build in the workspace directory:
catkin_make -j4
Then source the environment:
source devel/setup.bash
Next, prepare the vocabulary and settings files. The ORB vocabulary is at ORB-SLAM2/Vocabulary/ORBvoc.txt and the settings file at ORB-SLAM2/Examples/ROS/ORB-SLAM2/Asus.yaml; I put them in the workspace's data directory. Now run it:
#First, in a separate terminal:
roscore
#In the current terminal:
rosrun slam orb_slam_node data/ORBvoc.txt data/Asus.yaml
Then play back the dataset:
rosbag play /media/chen/chen/SLAM/datasets/rgbd_dataset_freiburg1_desk.bag \
    /camera/rgb/image_color:=/camera/rgb/image_raw \
    /camera/depth/image:=/camera/depth_registered/image_raw
When playing the bag, the topics it publishes are remapped to the names our node subscribes to.
At this point you should see map points and keyframes appear in the ORB-SLAM2 visualization window. Open another terminal to check the TF being published:
rosrun tf tf_monitor
You should see a frame named orb_slam2. To inspect its values:
rosrun tf tf_echo world orb_slam2
The output shows the pose estimated by orb_slam2.
4. Display in RViz
You can visualize the camera's motion in RViz. Launch it:
rosrun rviz rviz
Then display it with an Axes display: