Hands-On Visual SLAM (2): ORB-SLAM2 with Kinect2

Preface

  It has been a while since this hands-on series was updated. I recently got hold of a decent Thinkpad and a Kinect v2, and ORB-SLAM2 was released just a couple of days ago, so I wanted to try ORB-SLAM with the Kinect2. There is nothing particularly deep technically in the whole process; readers can treat this post as a write-up of the experimental steps.


ORB-SLAM

  ORB-SLAM is a monocular SLAM system released in 2015, and arguably one of the best monocular implementations around. Its code is also remarkably clean and builds painlessly; it clearly cares about the user experience of programmers like us, and has been warmly received. A few days ago its authors released ORB-SLAM2, which adds stereo and RGB-D interfaces on top of the original monocular pipeline, although the map is still the sparse feature-point map typical of monocular SLAM. So now we can run ORB-SLAM with all sorts of sensors! Since I happen to have a Kinect v2 on hand, let's run an experiment with it!

  About my setup, briefly: Ubuntu 14.04 on a Thinkpad T450, with a Kinect2 for Xbox.


Building the Kinect v2 driver

  To use the Kinect v2 under Ubuntu you need two things: first, build libfreenect2, the open-source driver for the Kinect v2; second, use kinect2_bridge to grab its images under ROS. Both are already explained clearly in hitcm's blog, so I won't repeat them here.

  Please follow hitcm's blog to install libfreenect2 and the iai_kinect2 packages:

  http://www.cnblogs.com/hitcm/p/5118196.html

  iai_kinect2 contains four modules. We mainly use its bridge to convert between image formats. In addition, you must calibrate the sensor with kinect2_calibration: on an uncalibrated Kinect2 the depth image and the color image are not guaranteed to correspond pixel for pixel, which will cause errors during SLAM. So please make sure to calibrate it. Fortunately the author documents the calibration procedure in detail, so even a beginner like me can get through it!

  The kinect2_calibration module, with a detailed explanation of the calibration procedure:

  https://github.com/code-iai/iai_kinect2/tree/master/kinect2_calibration

  Calibration gives you the intrinsics and extrinsics of the Kinect2's color, depth, and IR cameras, stored on your machine as (dreaded) yaml files. kinect2_bridge automatically detects your calibration files and rectifies the depth images. The SLAM stage later on mainly uses the color camera's intrinsics. You can also use the kinect2_viewer module to look at the point cloud and the images coming off the Kinect2.


Building ORB-SLAM2

  ORB-SLAM2's github is at: https://github.com/raulmur/ORB_SLAM2 Just clone it to your machine.

  This version of ORB-SLAM can be built without ROS, so you don't have to install ROS first. (We installed ROS anyway, since we need it for the Kinect2.) Its main dependencies are OpenCV 2.4, Eigen3, DBoW2, and g2o, plus a UI library I had barely heard of: Pangolin. DBoW2 and g2o are already bundled in its Thirdparty folder, so there is no need to download them separately (g2o version mismatches have always been a pitfall). For installing OpenCV, see the first post of my "Let's do RGB-D SLAM" series. So the only thing left to fetch is Pangolin.

  Pangolin's github: https://github.com/stevenlovegrove/Pangolin It is a cmake project with no unusual dependencies; just download, build, and install it.

  Then enter the orb-slam2 folder. The author thoughtfully provides a build.sh; simply running that script completes the build.

  Hopefully your build went through as well. The ORB-SLAM authors provide several examples, including stereo on KITTI and monocular/RGB-D on TUM, which we can follow when writing our own input code. If you just want to use ORB-SLAM2 as a self-contained module, you can directly call the SLAM System defined in include/System.h. Now let's throw the Kinect2 at it.
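  To get a feel for that interface first, here is a minimal sketch of driving the System class directly, without ROS. The file paths and image names are placeholders for illustration; the constructor and the TrackRGBD/Shutdown calls are the same ones used in the full program below.

#include <opencv2/opencv.hpp>
#include "System.h"   // ORB_SLAM2/include/System.h

int main()
{
    // Placeholders -- point these at your own vocabulary and settings files.
    ORB_SLAM2::System slam( "ORBvoc.txt", "kinect2_qhd.yaml",
                            ORB_SLAM2::System::RGBD, true );

    // Feed one registered color/depth pair; in practice you loop over a stream.
    cv::Mat color = cv::imread( "color.png", CV_LOAD_IMAGE_UNCHANGED );
    cv::Mat depth = cv::imread( "depth.png", CV_LOAD_IMAGE_UNCHANGED );
    slam.TrackRGBD( color, depth, /* timestamp in seconds */ 0.0 );

    slam.Shutdown();
    return 0;
}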


Running ORB-SLAM2 on the Kinect2

  The Kinect2 publishes its topics at three resolutions: hd is 1920×1080, qhd is quarter-HD at 960×540, and sd is the smallest (512×424). I found the sd results unsatisfactory and the hd images too large, so I suggest using the qhd (960×540) topics. When calling ORB-SLAM you pass the camera parameters in through a yaml file, so you need to write up your Kinect's parameters, something like this:

%YAML:1.0

#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------

# Camera calibration and distortion parameters (OpenCV) 
Camera.fx: 529.97
Camera.fy: 526.97
Camera.cx: 477.44
Camera.cy: 261.87

Camera.k1: 0.05627
Camera.k2: -0.0742
Camera.p1: 0.00142
Camera.p2: -0.00169
Camera.k3: 0.0241

Camera.width: 960
Camera.height: 540

# Camera frames per second 
Camera.fps: 30.0

# IR projector baseline times fx (approx.)
Camera.bf: 40.0

# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1

# Close/Far threshold. Baseline times.
ThDepth: 50.0

# Depthmap values factor 
DepthMapFactor: 1000.0

#--------------------------------------------------------------------------------------------
# ORB Parameters
#--------------------------------------------------------------------------------------------

# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 1000

# ORB Extractor: Scale factor between levels in the scale pyramid 	
ORBextractor.scaleFactor: 1.2

# ORB Extractor: Number of levels in the scale pyramid	
ORBextractor.nLevels: 8

# ORB Extractor: Fast threshold
# Image is divided in a grid. At each cell FAST are extracted imposing a minimum response.
# Firstly we impose iniThFAST. If no corners are detected we impose a lower value minThFAST
# You can lower these values if your images have low contrast			
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7

#--------------------------------------------------------------------------------------------
# Viewer Parameters
#--------------------------------------------------------------------------------------------
Viewer.KeyFrameSize: 0.05
Viewer.KeyFrameLineWidth: 1
Viewer.GraphLineWidth: 0.9
Viewer.PointSize: 2
Viewer.CameraSize: 0.08
Viewer.CameraLineWidth: 3
Viewer.ViewpointX: 0
Viewer.ViewpointY: -0.7
Viewer.ViewpointZ: -1.8
Viewer.ViewpointF: 500

  The ORB parameters can be left alone. Then rewrite kinect2_viewer a little, plug in ORB-SLAM, and it runs:

#include <stdlib.h>
#include <stdio.h>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>
#include <cmath>
#include <mutex>
#include <thread>
#include <chrono>

#include <ros/ros.h>
#include <ros/spinner.h>
#include <sensor_msgs/CameraInfo.h>
#include <sensor_msgs/Image.h>

#include <cv_bridge/cv_bridge.h>

#include <image_transport/image_transport.h>
#include <image_transport/subscriber_filter.h>

#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/exact_time.h>
#include <message_filters/sync_policies/approximate_time.h>

#include <kinect2_bridge/kinect2_definitions.h>

#include "orbslam2/System.h"

class Receiver
{
public:
  enum Mode
  {
    IMAGE = 0,
    CLOUD,
    BOTH
  };

private:
  std::mutex lock;

  const std::string topicColor, topicDepth;
  const bool useExact, useCompressed;

  bool updateImage, updateCloud;
  bool save;
  bool running;
  size_t frame;
  const size_t queueSize;

  cv::Mat color, depth;
  cv::Mat cameraMatrixColor, cameraMatrixDepth;
  cv::Mat lookupX, lookupY;

  typedef message_filters::sync_policies::ExactTime<sensor_msgs::Image, sensor_msgs::Image, sensor_msgs::CameraInfo, sensor_msgs::CameraInfo> ExactSyncPolicy;
  typedef message_filters::sync_policies::ApproximateTime<sensor_msgs::Image, sensor_msgs::Image, sensor_msgs::CameraInfo, sensor_msgs::CameraInfo> ApproximateSyncPolicy;

  ros::NodeHandle nh;
  ros::AsyncSpinner spinner;
  image_transport::ImageTransport it;
  image_transport::SubscriberFilter *subImageColor, *subImageDepth;
  message_filters::Subscriber<sensor_msgs::CameraInfo> *subCameraInfoColor, *subCameraInfoDepth;

  message_filters::Synchronizer<ExactSyncPolicy> *syncExact;
  message_filters::Synchronizer<ApproximateSyncPolicy> *syncApproximate;

  std::thread imageViewerThread;
  Mode mode;

  std::ostringstream oss;
  std::vector<int> params;

  ORB_SLAM2::System* orbslam = nullptr;  // the SLAM object

public:
  Receiver(const std::string &topicColor, const std::string &topicDepth, const bool useExact, const bool useCompressed)
    : topicColor(topicColor), topicDepth(topicDepth), useExact(useExact), useCompressed(useCompressed),
      updateImage(false), updateCloud(false), save(false), running(false), frame(0), queueSize(5),
      nh("~"), spinner(0), it(nh), mode(CLOUD)
  {
    cameraMatrixColor = cv::Mat::zeros(3, 3, CV_64F);
    cameraMatrixDepth = cv::Mat::zeros(3, 3, CV_64F);
    params.push_back(cv::IMWRITE_JPEG_QUALITY);
    params.push_back(100);
    params.push_back(cv::IMWRITE_PNG_COMPRESSION);
    params.push_back(1);
    params.push_back(cv::IMWRITE_PNG_STRATEGY);
    params.push_back(cv::IMWRITE_PNG_STRATEGY_RLE);
    params.push_back(0);

    // Vocabulary and settings files -- adjust these paths to your own setup.
    std::string orbVocFile = "/home/xiang/catkin_ws/src/walle/config/ORBvoc.txt";
    std::string orbSettingsFile = "/home/xiang/catkin_ws/src/walle/config/kinect2_sd.yaml";

    orbslam = new ORB_SLAM2::System( orbVocFile, orbSettingsFile, ORB_SLAM2::System::RGBD, true );
  }

  ~Receiver()
  {
      if (orbslam)
      {
          orbslam->Shutdown();
          delete orbslam;
      }
  }

  void run(const Mode mode)
  {
    start(mode);
    stop();
  }

  void finish() 
  {
  }
private:
  void start(const Mode mode)
  {
    this->mode = mode;
    running = true;

    std::string topicCameraInfoColor = topicColor.substr(0, topicColor.rfind('/')) + "/camera_info";
    std::string topicCameraInfoDepth = topicDepth.substr(0, topicDepth.rfind('/')) + "/camera_info";

    image_transport::TransportHints hints(useCompressed ? "compressed" : "raw");
    subImageColor = new image_transport::SubscriberFilter(it, topicColor, queueSize, hints);
    subImageDepth = new image_transport::SubscriberFilter(it, topicDepth, queueSize, hints);
    subCameraInfoColor = new message_filters::Subscriber<sensor_msgs::CameraInfo>(nh, topicCameraInfoColor, queueSize);
    subCameraInfoDepth = new message_filters::Subscriber<sensor_msgs::CameraInfo>(nh, topicCameraInfoDepth, queueSize);

    if(useExact)
    {
      syncExact = new message_filters::Synchronizer<ExactSyncPolicy>(ExactSyncPolicy(queueSize), *subImageColor, *subImageDepth, *subCameraInfoColor, *subCameraInfoDepth);
      syncExact->registerCallback(boost::bind(&Receiver::callback, this, _1, _2, _3, _4));
    }
    else
    {
      syncApproximate = new message_filters::Synchronizer<ApproximateSyncPolicy>(ApproximateSyncPolicy(queueSize), *subImageColor, *subImageDepth, *subCameraInfoColor, *subCameraInfoDepth);
      syncApproximate->registerCallback(boost::bind(&Receiver::callback, this, _1, _2, _3, _4));
    }

    spinner.start();

    std::chrono::milliseconds duration(1);
    while(!updateImage || !updateCloud)
    {
      if(!ros::ok())
      {
        return;
      }
      std::this_thread::sleep_for(duration);
    }
    createLookup(this->color.cols, this->color.rows);

    // (The point-cloud viewer of the original kinect2_viewer was dropped in
    // this adaptation, so only the IMAGE and BOTH modes start a viewer.)
    switch(mode)
    {
    case IMAGE:
      imageViewer();
      break;
    case BOTH:
      imageViewerThread = std::thread(&Receiver::imageViewer, this);
      break;
    }
  }

  void stop()
  {
    spinner.stop();

    if(useExact)
    {
      delete syncExact;
    }
    else
    {
      delete syncApproximate;
    }

    delete subImageColor;
    delete subImageDepth;
    delete subCameraInfoColor;
    delete subCameraInfoDepth;

    running = false;
    if(mode == BOTH)
    {
      imageViewerThread.join();
    }
  }

  void callback(const sensor_msgs::Image::ConstPtr imageColor, const sensor_msgs::Image::ConstPtr imageDepth,
                const sensor_msgs::CameraInfo::ConstPtr cameraInfoColor, const sensor_msgs::CameraInfo::ConstPtr cameraInfoDepth)
  {
    cv::Mat color, depth;

    readCameraInfo(cameraInfoColor, cameraMatrixColor);
    readCameraInfo(cameraInfoDepth, cameraMatrixDepth);
    readImage(imageColor, color);
    readImage(imageDepth, depth);

    // IR image input
    if(color.type() == CV_16U)
    {
      cv::Mat tmp;
      color.convertTo(tmp, CV_8U, 0.02);
      cv::cvtColor(tmp, color, CV_GRAY2BGR);
    }

    lock.lock();
    this->color = color;
    this->depth = depth;
    updateImage = true;
    updateCloud = true;
    lock.unlock();
  }

  void imageViewer()
  {
    cv::Mat color, depth;
    for(; running && ros::ok();)
    {
      if(updateImage)
      {
        lock.lock();
        color = this->color;
        depth = this->depth;
        updateImage = false;
        lock.unlock();

        if (orbslam)
        {
            // Hand the synchronized color/depth pair to ORB-SLAM2. (The image
            // header stamp would be a more accurate timestamp than now().)
            orbslam->TrackRGBD( color, depth, ros::Time::now().toSec() );
        }
      }

    }

    cv::destroyAllWindows();
    cv::waitKey(100);
  }

  void readImage(const sensor_msgs::Image::ConstPtr msgImage, cv::Mat &image) const
  {
    cv_bridge::CvImageConstPtr pCvImage;
    pCvImage = cv_bridge::toCvShare(msgImage, msgImage->encoding);
    pCvImage->image.copyTo(image);
  }

  void readCameraInfo(const sensor_msgs::CameraInfo::ConstPtr cameraInfo, cv::Mat &cameraMatrix) const
  {
    double *itC = cameraMatrix.ptr<double>(0, 0);
    for(size_t i = 0; i < 9; ++i, ++itC)
    {
      *itC = cameraInfo->K[i];
    }
  }

  void dispDepth(const cv::Mat &in, cv::Mat &out, const float maxValue)
  {
    cv::Mat tmp = cv::Mat(in.rows, in.cols, CV_8U);
    const uint32_t maxInt = 255;

    #pragma omp parallel for
    for(int r = 0; r < in.rows; ++r)
    {
      const uint16_t *itI = in.ptr<uint16_t>(r);
      uint8_t *itO = tmp.ptr<uint8_t>(r);

      for(int c = 0; c < in.cols; ++c, ++itI, ++itO)
      {
        *itO = (uint8_t)std::min((*itI * maxInt / maxValue), 255.0f);
      }
    }

    cv::applyColorMap(tmp, out, cv::COLORMAP_JET);
  }

  void combine(const cv::Mat &inC, const cv::Mat &inD, cv::Mat &out)
  {
    out = cv::Mat(inC.rows, inC.cols, CV_8UC3);

    #pragma omp parallel for
    for(int r = 0; r < inC.rows; ++r)
    {
      const cv::Vec3b
      *itC = inC.ptr<cv::Vec3b>(r),
       *itD = inD.ptr<cv::Vec3b>(r);
      cv::Vec3b *itO = out.ptr<cv::Vec3b>(r);

      for(int c = 0; c < inC.cols; ++c, ++itC, ++itD, ++itO)
      {
        itO->val[0] = (itC->val[0] + itD->val[0]) >> 1;
        itO->val[1] = (itC->val[1] + itD->val[1]) >> 1;
        itO->val[2] = (itC->val[2] + itD->val[2]) >> 1;
      }
    }
  }


  void createLookup(size_t width, size_t height)
  {
    const float fx = 1.0f / cameraMatrixColor.at<double>(0, 0);
    const float fy = 1.0f / cameraMatrixColor.at<double>(1, 1);
    const float cx = cameraMatrixColor.at<double>(0, 2);
    const float cy = cameraMatrixColor.at<double>(1, 2);
    float *it;

    lookupY = cv::Mat(1, height, CV_32F);
    it = lookupY.ptr<float>();
    for(size_t r = 0; r < height; ++r, ++it)
    {
      *it = (r - cy) * fy;
    }

    lookupX = cv::Mat(1, width, CV_32F);
    it = lookupX.ptr<float>();
    for(size_t c = 0; c < width; ++c, ++it)
    {
      *it = (c - cx) * fx;
    }
  }
};

int main(int argc, char **argv)
{
  ros::init(argc, argv, "kinect2_slam", ros::init_options::AnonymousName);

  if(!ros::ok())
  {
    return 0;
  }
  // These topics must match the resolution your ORB-SLAM2 settings yaml was
  // calibrated for; with the qhd parameters above, use the qhd topics.
  std::string topicColor = "/kinect2/qhd/image_color_rect";
  std::string topicDepth = "/kinect2/qhd/image_depth_rect";
  bool useExact = true;
  bool useCompressed = false;
  Receiver::Mode mode = Receiver::IMAGE;
  // initialize the receiver
  Receiver receiver(topicColor, topicDepth, useExact, useCompressed);

  //OUT_INFO("starting receiver...");
  receiver.run(mode);

  receiver.finish();

  ros::shutdown();

  return 0;
}

As for the build, you only need to add ORB-SLAM's headers and library to your CMakeLists.txt and tell cmake to link against them. You could even drop the whole orb-slam tree into your own source directory and build everything together, but I simply copied liborb_slam2.so and the header files over.
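  For reference, a minimal sketch of that CMakeLists.txt addition might look like the following. It assumes you copied the headers into an include/ directory and liborb_slam2.so into lib/ under your package, as described above; the directory layout and the kinect2_slam target name are just illustrative.

# Minimal sketch -- directory layout and target name are illustrative.
include_directories( ${PROJECT_SOURCE_DIR}/include )   # contains orbslam2/System.h
link_directories( ${PROJECT_SOURCE_DIR}/lib )          # contains liborb_slam2.so

add_executable( kinect2_slam src/kinect2_slam.cpp )
target_link_libraries( kinect2_slam
    orb_slam2
    ${OpenCV_LIBS}
    ${catkin_LIBRARIES}
)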

  Actual hand-held Kinect2 footage (since cnblogs cannot host video, Baidu Yun has to serve as a makeshift player for now): http://pan.baidu.com/s/1eRcyW1s (thanks to my classmate Yedong for the guest appearance...)

  Results on the dataset from the "Let's do RGB-D SLAM" series: http://pan.baidu.com/s/1bocx5s

  On the whole the results are quite satisfactory.


Summary

  This post showed how ORB-SLAM2 performs on a Kinect2; by and large the results are satisfactory. If you try it yourself, please pay close attention to the Kinect2 calibration, or things are very likely to go wrong.

