A Performance Comparison of ORB/BRISK/AKAZE Feature Extraction and Matching

Foreword

Local feature algorithms have been hugely popular over the past two decades, with SIFT and SURF as representative examples, widely used in object detection, recognition, matching, and localization. Both build a Gaussian scale space with a pyramid strategy (SURF approximates the Gaussian with box filters). Whether SIFT or SURF, constructing the scale space this way has an important drawback: Gaussian blurring does not preserve object boundaries, and it smooths detail and noise to the same degree at every scale, which hurts localization accuracy and distinctiveness.
  To address this shortcoming of Gaussian scale spaces, researchers proposed building the scale space with nonlinear filters, namely bilateral filtering and nonlinear diffusion filtering. A nonlinear filtering strategy can locally and adaptively remove small details while preserving object boundaries, so the scale space retains more feature information. For example, BFSIFT uses bilateral filtering together with bidirectional matching to improve SIFT's poor matching performance on SAR images (caused mainly by severe speckle noise), at the cost of higher computational complexity. KAZE, proposed earlier by the authors of AKAZE, uses nonlinear diffusion filtering and improves repeatability and distinctiveness over SIFT and SURF. KAZE's drawback is that it is computationally intensive: it solves the nonlinear diffusion equation with the AOS (Additive Operator Splitting) numerical scheme, which is stable and parallelizable but requires solving large linear systems, making real-time operation on mobile devices hard to achieve.

Overview of Feature Point Extraction and Matching

Feature point extraction: given an input image, extract a set of points that characterize it. Depending on the extraction algorithm, these points carry different properties.

Feature extraction usually proceeds in two steps:

  • detect keypoints
  • compute descriptors

Feature matching: using the computed descriptors and a chosen distance metric, match descriptors across the two images and output the successfully matched keypoint pairs (matches).

ORB Features

ORB (Oriented FAST and Rotated BRIEF) was presented at ICCV 2011; it pairs an orientation-aware FAST corner detector with a rotation-compensated BRIEF binary descriptor, trading some invariance for speed, which makes it a popular choice for real-time applications.

BRISK Features

Binary Robust Invariant Scalable Keypoints
A feature extraction algorithm presented at ICCV 2011. BRISK builds an image pyramid for multi-scale representation, which gives it good rotation invariance, scale invariance, and robustness.
In image registration applications, a commonly reported speed ranking is SIFT < SURF < BRISK < FREAK < ORB; when registering images with heavy blur, BRISK performs best among them.

AKAZE Features

KAZE is a feature detection and description algorithm introduced at ECCV 2012; AKAZE (Accelerated-KAZE) is its improved successor.

The authors' goal was to bring local feature algorithms to mobile devices, which have limited resources and strict real-time requirements. AKAZE improves on KAZE in two main respects:

  • 1. It obtains the benefits of nonlinear diffusion filtering at low computational cost by introducing Fast Explicit Diffusion (FED), a mathematical framework for quickly solving the underlying partial differential equations. Building the scale space with FED is faster than other nonlinear schemes to date, and more accurate than AOS.
  • 2. It introduces an efficient Modified Local Difference Binary descriptor (M-LDB) that adds rotation and scale robustness over the original LDB, and gains distinctiveness by exploiting the gradient information already computed for the FED scale space.

Compared with SIFT and SURF, AKAZE is faster; compared with ORB and BRISK, it offers much better repeatability and robustness.

OpenCV3 Implementation

#include <iostream>
#include <opencv2/core/core.hpp>
#include <opencv2/features2d/features2d.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <chrono>  //time
// #include <opencv2/calib3d/calib3d.hpp>
// #include "extra.h" // use this if in OpenCV2 
using namespace std;
using namespace cv;


// FAST feature detection
void FAST_feature(const Mat& img, std::vector<KeyPoint>& keypoints);

// ORB feature extraction and matching
void ORB_feature_matches (
    const Mat& img_1, const Mat& img_2,
    std::vector<KeyPoint>& keypoints_1,
    std::vector<KeyPoint>& keypoints_2,
    std::vector< DMatch >& matches );


// BRISK feature extraction and matching
void BRISK_feature_matches (
    const Mat& img_1, const Mat& img_2,
    std::vector<KeyPoint>& keypoints_1,
    std::vector<KeyPoint>& keypoints_2,
    std::vector< DMatch >& matches );

// AKAZE feature extraction and matching
void AKAZE_feature_matches (
    const Mat& img_1, const Mat& img_2,
    std::vector<KeyPoint>& keypoints_1,
    std::vector<KeyPoint>& keypoints_2,
    std::vector< DMatch >& matches );


int main ( int argc, char** argv )
{
    if ( argc != 3 )
    {
        cout<<"usage: feature_extraction img1 img2"<<endl;
        return 1;
    }
    //-- read images (IMREAD_COLOR replaces the OpenCV2 macro CV_LOAD_IMAGE_COLOR)
    Mat img_1 = imread ( argv[1], IMREAD_COLOR );
    Mat img_2 = imread ( argv[2], IMREAD_COLOR );
    if ( img_1.empty() || img_2.empty() )
    {
        cout<<"could not load input images"<<endl;
        return 1;
    }

// FAST -FEATURE
    vector<KeyPoint> fast_keypoints;
    FAST_feature(img_1, fast_keypoints);
    cout<<"use FAST_feature found "<<fast_keypoints.size() <<" keypoints"<<endl;
    Mat fast_feature_image;
    drawKeypoints(img_1, fast_keypoints, fast_feature_image, Scalar::all(-1), DrawMatchesFlags::DEFAULT);
    imshow("FAST_feature_image", fast_feature_image);
    cout << "\r\n" << endl;

// ORB-FEATURE
    vector<KeyPoint> orb_keypoints_1, orb_keypoints_2;
    vector<DMatch> orb_matches;
    ORB_feature_matches ( img_1, img_2, orb_keypoints_1, orb_keypoints_2, orb_matches );
    cout<<"use ORB_feature_matches found "<<orb_keypoints_1.size() <<" keypoints"<<endl;
    // draw ORB keypoints and matches
    Mat orb_feature_image, orb_matches_image;
    drawKeypoints(img_1, orb_keypoints_1, orb_feature_image, Scalar::all(-1), DrawMatchesFlags::DEFAULT);
    imshow("ORB_feature_image", orb_feature_image);
    drawMatches ( img_1, orb_keypoints_1, img_2, orb_keypoints_2, orb_matches, orb_matches_image );
    imshow("ORB_matches_image", orb_matches_image);
    cout << "\r\n" << endl;

// BRISK-FEATURE
    vector<KeyPoint> brisk_keypoint_1, brisk_keypoint_2;
    vector<DMatch> brisk_matches;
    BRISK_feature_matches(img_1, img_2, brisk_keypoint_1, brisk_keypoint_2, brisk_matches);
    cout << "use BRISK_feature_matches found " << brisk_keypoint_1.size() << " keypoints" << endl;

    Mat brisk_feature_image, brisk_matches_image;
    drawKeypoints(img_1, brisk_keypoint_1, brisk_feature_image, Scalar::all(-1), DrawMatchesFlags::DEFAULT);
    imshow("BRISK feature image", brisk_feature_image);
    drawMatches(img_1, brisk_keypoint_1, img_2 ,brisk_keypoint_2, brisk_matches, brisk_matches_image);
    imshow("BRISK matches image", brisk_matches_image);
    cout << "\r\n" << endl;

// AKAZE-FEATURE
    vector<KeyPoint> akaze_keypoint_1, akaze_keypoint_2;
    vector<DMatch> akaze_matches;
    AKAZE_feature_matches(img_1, img_2, akaze_keypoint_1, akaze_keypoint_2, akaze_matches);
    cout << "use AKAZE_feature_matches found " << akaze_keypoint_1.size() << " keypoints" << endl;

    Mat akaze_feature_image, akaze_matches_image;
    drawKeypoints(img_1, akaze_keypoint_1, akaze_feature_image, Scalar::all(-1), DrawMatchesFlags::DEFAULT);
    imshow("akaze feature image", akaze_feature_image);
    drawMatches(img_1, akaze_keypoint_1, img_2 ,akaze_keypoint_2, akaze_matches, akaze_matches_image);
    imshow("akaze matches image", akaze_matches_image);

    waitKey(0);  // cvWaitKey is the deprecated C API
    return 0;
}


void FAST_feature(const Mat& img, std::vector<KeyPoint>& keypoints)
{
    const int threshold = 50;
    const bool nonmaxSuppression = true;
    const int type = FastFeatureDetector::TYPE_7_12;

    Ptr<FeatureDetector> fast_feature = FastFeatureDetector::create(threshold,  nonmaxSuppression, type);
    // Ptr<FeatureDetector> fast_feature = FastFeatureDetector::create();
    fast_feature->detect(img, keypoints);
    
}

void ORB_feature_matches ( const Mat& img_1, const Mat& img_2,
                            std::vector<KeyPoint>& keypoints_1,
                            std::vector<KeyPoint>& keypoints_2,
                            std::vector< DMatch >& matches )
{
    //-- initialization
    Mat descriptors_1, descriptors_2;
    // used in OpenCV3
    Ptr<FeatureDetector> detector = ORB::create(500);
    Ptr<DescriptorExtractor> descriptor = ORB::create();
    // use this if you are in OpenCV2
    // Ptr<FeatureDetector> detector = FeatureDetector::create ( "ORB" );
    // Ptr<DescriptorExtractor> descriptor = DescriptorExtractor::create ( "ORB" );
    Ptr<DescriptorMatcher> matcher  = DescriptorMatcher::create ( "BruteForce-Hamming" );
    //-- Step 1: detect Oriented FAST corner locations
    chrono::steady_clock::time_point t1 = chrono::steady_clock::now();
    detector->detect ( img_1, keypoints_1 );
    detector->detect ( img_2, keypoints_2 );
    chrono::steady_clock::time_point t2 = chrono::steady_clock::now();
    chrono::duration<double> time_used1 = chrono::duration_cast<chrono::duration<double>>( t2-t1 );
    cout << "ORB Feature detector time: " << time_used1.count() << " seconds" <<endl;

    //-- Step 2: compute BRIEF descriptors at the detected corners
    chrono::steady_clock::time_point t3 = chrono::steady_clock::now();
    descriptor->compute ( img_1, keypoints_1, descriptors_1 );
    descriptor->compute ( img_2, keypoints_2, descriptors_2 );
    chrono::steady_clock::time_point t4 = chrono::steady_clock::now();
    chrono::duration<double> time_used2 = chrono::duration_cast<chrono::duration<double>>( t4-t3 );
    cout << "ORB Descriptor Extractor time: " << time_used2.count() << " seconds" <<endl;

    //-- Step 3: match the BRIEF descriptors of the two images using Hamming distance
    vector<DMatch> match;
    chrono::steady_clock::time_point t5 = chrono::steady_clock::now();
    //BFMatcher matcher ( NORM_HAMMING );
    matcher->match ( descriptors_1, descriptors_2, match );
    chrono::steady_clock::time_point t6 = chrono::steady_clock::now();
    chrono::duration<double> time_used3 = chrono::duration_cast<chrono::duration<double>>( t6-t5 );
    cout << "ORB Descriptor match time: " << time_used3.count() << " seconds" <<endl;

    //-- Step 4: filter the matched pairs
    double min_dist=10000, max_dist=0;

    // find the minimum and maximum distance over all matches, i.e. the
    // distances of the most similar and least similar pairs of points
    for ( int i = 0; i < descriptors_1.rows; i++ )
    {
        double dist = match[i].distance;
        if ( dist < min_dist ) min_dist = dist;
        if ( dist > max_dist ) max_dist = dist;
    }

    printf ( "-- Max dist : %f \n", max_dist );
    printf ( "-- Min dist : %f \n", min_dist );

    // a match is rejected when the descriptor distance exceeds twice the minimum
    // distance; since the minimum can be very small, an empirical floor of 30 is used.
    for ( int i = 0; i < descriptors_1.rows; i++ )
    {
        if ( match[i].distance <= max ( 2*min_dist, 30.0 ) )
        {
            matches.push_back ( match[i] );
        }
    }
    printf ( "ORB-- All matches : %d \n", (int)match.size() );
    printf ( "ORB-- filter match : %d \n", (int)matches.size() );
}

// BRISK feature extraction and matching
void BRISK_feature_matches (  const Mat& img_1, const Mat& img_2,
    std::vector<KeyPoint>& keypoints_1, std::vector<KeyPoint>& keypoints_2,    std::vector< DMatch >& matches )
{
    // BRISK: initialize keypoint detector, descriptor extractor, and matcher
    Ptr<FeatureDetector>  brisk_feature = BRISK::create(40);
    Ptr<DescriptorExtractor> descriptor = BRISK::create();
    Ptr<DescriptorMatcher> matcher = DescriptorMatcher::create( "BruteForce-Hamming" );

    // detect keypoints
    chrono::steady_clock::time_point t7 = chrono::steady_clock::now();
    brisk_feature->detect(img_1,  keypoints_1);
    brisk_feature->detect(img_2,  keypoints_2);
    chrono::steady_clock::time_point t8 = chrono::steady_clock::now();
    chrono::duration<double> time_used4 = chrono::duration_cast<chrono::duration<double>>( t8-t7 );
    cout << "BRISK Feature detector time: " << time_used4.count() << " seconds" <<endl;

    // compute descriptors
    Mat descriptors_1, descriptors_2;
    chrono::steady_clock::time_point t3 = chrono::steady_clock::now();
    descriptor->compute(img_1, keypoints_1, descriptors_1);
    descriptor->compute(img_2, keypoints_2, descriptors_2);
    chrono::steady_clock::time_point t4 = chrono::steady_clock::now();
    chrono::duration<double> time_used2 = chrono::duration_cast<chrono::duration<double>>( t4-t3 );
    cout << "BRISK Descriptor Extractor time: " << time_used2.count() << " seconds" <<endl;

    // match descriptors, then filter out bad matches
    vector<DMatch> match;
    chrono::steady_clock::time_point t5 = chrono::steady_clock::now();
    matcher->match(descriptors_1, descriptors_2,  match);
    chrono::steady_clock::time_point t6 = chrono::steady_clock::now();
    chrono::duration<double> time_used3 = chrono::duration_cast<chrono::duration<double>>( t6-t5 );
    cout << "BRISK Descriptor match time: " << time_used3.count() << " seconds" <<endl;

    double min_dist = 10000, max_dist =0;
    for(int i=0; i< descriptors_1.rows; i++)
    {
        double dist = match[i].distance;
        if(dist < min_dist) min_dist = dist;
        if(dist > max_dist) max_dist = dist;
    }
    printf ( "BRISK-- Max dist : %f \n", max_dist );
    printf ( "BRISK-- Min dist : %f \n", min_dist );

    for(int j=0; j< descriptors_1.rows; j++)
    {
        if(match[j].distance < max (2*min_dist, 30.0))
            matches.push_back(match[j]);
    }
    printf ( "BRISK-- All matches : %d \n", (int)match.size() );
    printf ( "BRISK-- filter match : %d \n", (int)matches.size() );

}



// AKAZE feature extraction and matching
void AKAZE_feature_matches (  const Mat& img_1, const Mat& img_2,
    std::vector<KeyPoint>& keypoints_1, std::vector<KeyPoint>& keypoints_2,    std::vector< DMatch >& matches )
{
    // AKAZE: initialize keypoint detector, descriptor extractor, and matcher
    Ptr<FeatureDetector>  akaze_feature = AKAZE::create();
    Ptr<DescriptorExtractor> descriptor = AKAZE::create();
    Ptr<DescriptorMatcher> matcher = DescriptorMatcher::create( "BruteForce-Hamming" );

    // detect keypoints
    chrono::steady_clock::time_point t7 = chrono::steady_clock::now();
    akaze_feature->detect(img_1,  keypoints_1);
    akaze_feature->detect(img_2,  keypoints_2);
    chrono::steady_clock::time_point t8 = chrono::steady_clock::now();
    chrono::duration<double> time_used4 = chrono::duration_cast<chrono::duration<double>>( t8-t7 );
    cout << "AKAZE Feature detector time: " << time_used4.count() << " seconds" <<endl;

    // compute descriptors
    Mat descriptors_1, descriptors_2;
    chrono::steady_clock::time_point t3 = chrono::steady_clock::now();
    descriptor->compute(img_1, keypoints_1, descriptors_1);
    descriptor->compute(img_2, keypoints_2, descriptors_2);
    chrono::steady_clock::time_point t4 = chrono::steady_clock::now();
    chrono::duration<double> time_used2 = chrono::duration_cast<chrono::duration<double>>( t4-t3 );
    cout << "AKAZE Descriptor Extractor time: " << time_used2.count() << " seconds" <<endl;

    // match descriptors, then filter out bad matches
    vector<DMatch> match;
    chrono::steady_clock::time_point t5 = chrono::steady_clock::now();
    matcher->match(descriptors_1, descriptors_2,  match);
    chrono::steady_clock::time_point t6 = chrono::steady_clock::now();
    chrono::duration<double> time_used3 = chrono::duration_cast<chrono::duration<double>>( t6-t5 );
    cout << "AKAZE Descriptor match time: " << time_used3.count() << " seconds" <<endl;

    double min_dist = 10000, max_dist =0;
    for(int i=0; i< descriptors_1.rows; i++)
    {
        double dist = match[i].distance;
        if(dist < min_dist) min_dist = dist;
        if(dist > max_dist) max_dist = dist;
    }
    printf ( "AKAZE-- Max dist : %f \n", max_dist );
    printf ( "AKAZE-- Min dist : %f \n", min_dist );

    for(int j=0; j< descriptors_1.rows; j++)    
    {
        if(match[j].distance < max (2*min_dist, 30.0))
            matches.push_back(match[j]);
    }
    printf ( "AKAZE-- All matches : %d \n", (int)match.size() );
    printf ( "AKAZE-- filter match : %d \n", (int)matches.size() );   

}
CMakeLists.txt:

cmake_minimum_required( VERSION 2.8 )
project( vo1 )

set( CMAKE_BUILD_TYPE "Release" )
set( CMAKE_CXX_FLAGS "-std=c++11 -O3" )

find_package( OpenCV 3.1 REQUIRED )
include_directories( 
    ${OpenCV_INCLUDE_DIRS} 
)

add_executable( feature_extraction feature_extraction.cpp  )
target_link_libraries( feature_extraction ${OpenCV_LIBS} )

Comparison Experiments

ORB Feature detector time: 0.344205 seconds
ORB Descriptor Extractor time: 0.0210536 seconds
ORB Descriptor match time: 0.00212562 seconds
-- Max dist : 95.000000
-- Min dist : 7.000000
ORB-- All matches : 500
ORB-- filter match : 81
use ORB_feature_matches found 500 keypoints

================================================

BRISK Feature detector time: 0.0580227 seconds
BRISK Descriptor Extractor time: 0.0406389 seconds
BRISK Descriptor match time: 0.00707552 seconds
BRISK-- Max dist : 183.000000
BRISK-- Min dist : 16.000000
BRISK-- All matches : 1117
BRISK-- filter match : 44
use BRISK_feature_matches found 1117 keypoints

================================================

AKAZE Feature detector time: 0.121612 seconds
AKAZE Descriptor Extractor time: 0.092555 seconds
AKAZE Descriptor match time: 0.00412018 seconds
AKAZE-- Max dist : 174.000000
AKAZE-- Min dist : 9.000000
AKAZE-- All matches : 773
AKAZE-- filter match : 73
use AKAZE_feature_matches found 773 keypoints

  • ORB feature extraction (figure)

  • BRISK feature extraction (figure)

  • AKAZE feature extraction (figure)

  • ORB feature matching (figure)

  • BRISK feature matching (figure)

  • AKAZE feature matching (figure)

References

AKAZE算法分析 (AKAZE algorithm analysis)
