OpenCV ORB not finding matches once rotation/scale invariance is introduced

20
I am working on a project using the ORB feature detector in OpenCV 2.3.1. I am finding matches between 8 different images, 6 of which are very similar (20 cm difference in camera position along a linear slider, so no scale or rotational variance) and 2 taken from about a 45-degree angle on either side. My code finds plenty of accurate matches between the very similar images, but few to none for the images taken from a more distinct perspective. I have included what seem to be the relevant parts of my code; let me know if more information would be helpful.
// set parameters

int numKeyPoints = 1500;
float distThreshold = 15.0;

//instantiate detector, extractor, matcher

cv::Ptr<cv::FeatureDetector> detector = new cv::OrbFeatureDetector(numKeyPoints);
cv::Ptr<cv::DescriptorExtractor> extractor = new cv::OrbDescriptorExtractor;
cv::Ptr<cv::DescriptorMatcher> matcher = new cv::BruteForceMatcher<cv::HammingLUT>;

//Load input images and detect keypoints

cv::Mat img1;
std::vector<cv::KeyPoint> img1_keypoints;
cv::Mat img1_descriptors;
cv::Mat img2;
std::vector<cv::KeyPoint> img2_keypoints;
cv::Mat img2_descriptors;
img1 = cv::imread(fList[0].string(), CV_LOAD_IMAGE_GRAYSCALE);
img2 = cv::imread(fList[1].string(), CV_LOAD_IMAGE_GRAYSCALE);
detector->detect(img1, img1_keypoints);
detector->detect(img2, img2_keypoints);
extractor->compute(img1, img1_keypoints, img1_descriptors);
extractor->compute(img2, img2_keypoints, img2_descriptors);

//Match keypoints using knnMatch to find the single best match for each keypoint
//Then cull results that fall below given distance threshold

std::vector<std::vector<cv::DMatch> > matches;
matcher->knnMatch(img1_descriptors, img2_descriptors, matches, 1);
int matchCount=0;
for (size_t n = 0; n < matches.size(); ++n) {
    if (matches[n].size() > 0){
        if (matches[n][0].distance > distThreshold){
            matches[n].erase(matches[n].begin());
        }else{
            ++matchCount;
        }
    }
}
3 Answers

28

After changing the method used to filter matches, I get enough useful matches. The previous method discarded a lot of good matches on the basis of their distance value alone. The RobustMatcher class found in the OpenCV 2 Computer Vision Application Programming Cookbook works very well. Now that all of my matches are accurate, I have been able to get good enough results by bumping up the number of keypoints the ORB detector searches for. Using SIFT or SURF with the RobustMatcher still gives better results, but I am getting usable data with ORB now.

//RobustMatcher class taken from OpenCV2 Computer Vision Application Programming Cookbook Ch 9
class RobustMatcher {
  private:
     // pointer to the feature point detector object
     cv::Ptr<cv::FeatureDetector> detector;
     // pointer to the feature descriptor extractor object
     cv::Ptr<cv::DescriptorExtractor> extractor;
     // pointer to the matcher object
     cv::Ptr<cv::DescriptorMatcher > matcher;
     float ratio; // max ratio between 1st and 2nd NN
     bool refineF; // if true will refine the F matrix
     double distance; // min distance to epipolar
     double confidence; // confidence level (probability)
  public:
     RobustMatcher() : ratio(0.65f), refineF(true),
                       confidence(0.99), distance(3.0) {
        // ORB is the default feature
        detector= new cv::OrbFeatureDetector();
        extractor= new cv::OrbDescriptorExtractor();
        matcher= new cv::BruteForceMatcher<cv::HammingLUT>;
     }

  // Set the feature detector
  void setFeatureDetector(
         cv::Ptr<cv::FeatureDetector>& detect) {
     detector= detect;
  }
  // Set the descriptor extractor
  void setDescriptorExtractor(
         cv::Ptr<cv::DescriptorExtractor>& desc) {
     extractor= desc;
  }
  // Set the matcher
  void setDescriptorMatcher(
         cv::Ptr<cv::DescriptorMatcher>& match) {
     matcher= match;
  }
  // Set confidence level
  void setConfidenceLevel(
         double conf) {
     confidence= conf;
  }
  //Set MinDistanceToEpipolar
  void setMinDistanceToEpipolar(
         double dist) {
     distance= dist;
  }
  //Set ratio
  void setRatio(
         float rat) {
     ratio= rat;
  }

  // Clear matches for which NN ratio is > than threshold
  // return the number of removed points
  // (corresponding entries being cleared,
  // i.e. size will be 0)
  int ratioTest(std::vector<std::vector<cv::DMatch> >
                                               &matches) {
    int removed=0;
      // for all matches
    for (std::vector<std::vector<cv::DMatch> >::iterator
             matchIterator= matches.begin();
         matchIterator!= matches.end(); ++matchIterator) {
           // if 2 NN has been identified
           if (matchIterator->size() > 1) {
               // check distance ratio
               if ((*matchIterator)[0].distance/
                   (*matchIterator)[1].distance > ratio) {
                  matchIterator->clear(); // remove match
                  removed++;
               }
           } else { // does not have 2 neighbours
               matchIterator->clear(); // remove match
               removed++;
           }
    }
    return removed;
  }

  // Insert symmetrical matches in symMatches vector
  void symmetryTest(
      const std::vector<std::vector<cv::DMatch> >& matches1,
      const std::vector<std::vector<cv::DMatch> >& matches2,
      std::vector<cv::DMatch>& symMatches) {
    // for all matches image 1 -> image 2
    for (std::vector<std::vector<cv::DMatch> >::
             const_iterator matchIterator1= matches1.begin();
         matchIterator1!= matches1.end(); ++matchIterator1) {
       // ignore deleted matches
       if (matchIterator1->size() < 2)
           continue;
       // for all matches image 2 -> image 1
       for (std::vector<std::vector<cv::DMatch> >::
          const_iterator matchIterator2= matches2.begin();
           matchIterator2!= matches2.end();
           ++matchIterator2) {
           // ignore deleted matches
           if (matchIterator2->size() < 2)
              continue;
           // Match symmetry test
           if ((*matchIterator1)[0].queryIdx ==
               (*matchIterator2)[0].trainIdx &&
               (*matchIterator2)[0].queryIdx ==
               (*matchIterator1)[0].trainIdx) {
               // add symmetrical match
                 symMatches.push_back(
                   cv::DMatch((*matchIterator1)[0].queryIdx,
                             (*matchIterator1)[0].trainIdx,
                             (*matchIterator1)[0].distance));
                 break; // next match in image 1 -> image 2
           }
       }
    }
  }

  // Identify good matches using RANSAC
  // Return the fundamental matrix
  cv::Mat ransacTest(
      const std::vector<cv::DMatch>& matches,
      const std::vector<cv::KeyPoint>& keypoints1,
      const std::vector<cv::KeyPoint>& keypoints2,
      std::vector<cv::DMatch>& outMatches) {
   // Convert keypoints into Point2f
   std::vector<cv::Point2f> points1, points2;
   cv::Mat fundemental;
   for (std::vector<cv::DMatch>::
         const_iterator it= matches.begin();
       it!= matches.end(); ++it) {
       // Get the position of left keypoints
       float x= keypoints1[it->queryIdx].pt.x;
       float y= keypoints1[it->queryIdx].pt.y;
       points1.push_back(cv::Point2f(x,y));
       // Get the position of right keypoints
       x= keypoints2[it->trainIdx].pt.x;
       y= keypoints2[it->trainIdx].pt.y;
       points2.push_back(cv::Point2f(x,y));
    }
   // Compute F matrix using RANSAC
   std::vector<uchar> inliers(points1.size(),0);
   if (points1.size()>0&&points2.size()>0){
      // assign to the outer `fundemental` (the original cookbook code
      // declared a shadowing local cv::Mat here, so an empty matrix was
      // returned whenever refineF was false)
      fundemental = cv::findFundamentalMat(
         cv::Mat(points1),cv::Mat(points2), // matching points
          inliers,       // match status (inlier or outlier)
          CV_FM_RANSAC, // RANSAC method
          distance,      // distance to epipolar line
          confidence); // confidence probability
      // extract the surviving (inliers) matches
      std::vector<uchar>::const_iterator
                         itIn= inliers.begin();
      std::vector<cv::DMatch>::const_iterator
                         itM= matches.begin();
      // for all matches
      for ( ;itIn!= inliers.end(); ++itIn, ++itM) {
         if (*itIn) { // it is a valid match
             outMatches.push_back(*itM);
          }
       }
       if (refineF) {
       // The F matrix will be recomputed with
       // all accepted matches
          // Convert keypoints into Point2f
          // for final F computation
          points1.clear();
          points2.clear();
          for (std::vector<cv::DMatch>::
                 const_iterator it= outMatches.begin();
              it!= outMatches.end(); ++it) {
              // Get the position of left keypoints
              float x= keypoints1[it->queryIdx].pt.x;
              float y= keypoints1[it->queryIdx].pt.y;
              points1.push_back(cv::Point2f(x,y));
              // Get the position of right keypoints
              x= keypoints2[it->trainIdx].pt.x;
              y= keypoints2[it->trainIdx].pt.y;
              points2.push_back(cv::Point2f(x,y));
          }
          // Compute 8-point F from all accepted matches
          if (points1.size()>0&&points2.size()>0){
             fundemental= cv::findFundamentalMat(
                cv::Mat(points1),cv::Mat(points2), // matches
                CV_FM_8POINT); // 8-point method
          }
       }
    }
    return fundemental;
  }

  // Match feature points using symmetry test and RANSAC
  // returns the fundamental matrix
  cv::Mat match(cv::Mat& image1,
                cv::Mat& image2, // input images
     // output matches and keypoints
     std::vector<cv::DMatch>& matches,
     std::vector<cv::KeyPoint>& keypoints1,
     std::vector<cv::KeyPoint>& keypoints2) {
   // 1a. Detection of the features (ORB by default)
   detector->detect(image1,keypoints1);
   detector->detect(image2,keypoints2);
   // 1b. Extraction of the descriptors
   cv::Mat descriptors1, descriptors2;
   extractor->compute(image1,keypoints1,descriptors1);
   extractor->compute(image2,keypoints2,descriptors2);
   // 2. Match the two image descriptors
   // Construction of the matcher
   //cv::BruteForceMatcher<cv::L2<float>> matcher;
   // from image 1 to image 2
   // based on k nearest neighbours (with k=2)
   std::vector<std::vector<cv::DMatch> > matches1;
   matcher->knnMatch(descriptors1,descriptors2,
       matches1, // vector of matches (up to 2 per entry)
       2);        // return 2 nearest neighbours
    // from image 2 to image 1
    // based on k nearest neighbours (with k=2)
    std::vector<std::vector<cv::DMatch> > matches2;
    matcher->knnMatch(descriptors2,descriptors1,
       matches2, // vector of matches (up to 2 per entry)
       2);        // return 2 nearest neighbours
    // 3. Remove matches for which NN ratio is
    // > than threshold
    // clean image 1 -> image 2 matches
    int removed= ratioTest(matches1);
    // clean image 2 -> image 1 matches
    removed= ratioTest(matches2);
    // 4. Remove non-symmetrical matches
    std::vector<cv::DMatch> symMatches;
    symmetryTest(matches1,matches2,symMatches);
    // 5. Validate matches using RANSAC
    cv::Mat fundemental= ransacTest(symMatches,
                keypoints1, keypoints2, matches);
    // return the found fundamental matrix
    return fundemental;
  }
};


// set parameters

int numKeyPoints = 1500;

//Instantiate robust matcher

RobustMatcher rmatcher;

//instantiate detector, extractor, matcher

cv::Ptr<cv::FeatureDetector> detector = new cv::OrbFeatureDetector(numKeyPoints);
cv::Ptr<cv::DescriptorExtractor> extractor = new cv::OrbDescriptorExtractor;
cv::Ptr<cv::DescriptorMatcher> matcher = new cv::BruteForceMatcher<cv::HammingLUT>;

rmatcher.setFeatureDetector(detector);
rmatcher.setDescriptorExtractor(extractor);
rmatcher.setDescriptorMatcher(matcher);

//Load input images and detect keypoints

cv::Mat img1;
std::vector<cv::KeyPoint> img1_keypoints;
cv::Mat img1_descriptors;
cv::Mat img2;
std::vector<cv::KeyPoint> img2_keypoints;
cv::Mat img2_descriptors;
std::vector<cv::DMatch> matches; // match() fills a flat vector, not vector<vector<...> >
img1 = cv::imread(fList[0].string(), CV_LOAD_IMAGE_GRAYSCALE);
img2 = cv::imread(fList[1].string(), CV_LOAD_IMAGE_GRAYSCALE);

rmatcher.match(img1, img2, matches, img1_keypoints, img2_keypoints);
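
For a quick visual sanity check (a minimal sketch, not part of the pipeline above; the output path is just an example), the surviving matches can be drawn with cv::drawMatches:

// Sketch: render the filtered matches side by side for inspection.
// Assumes img1/img2, the keypoint vectors, and the flat `matches`
// vector filled in by rmatcher.match() above.
cv::Mat matchImg;
cv::drawMatches(img1, img1_keypoints, img2, img2_keypoints,
                matches, matchImg);
cv::imwrite("matches.png", matchImg);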

1
Your last name is the same as the SIFT developer's — are you David Lowe's son? :) I am also interested in the robustness of matching algorithms; the only difference I see here from the popular knn + ratio test is the symmetry test. Does it add significant robustness? - happy_marmoset
1
Haha, no relation to David Lowe :) I found that adding the symmetryTest and ransacTest gives noticeably better results. There is some performance cost, but that is not a problem in my current, not particularly performance-sensitive setting. - KLowe
1
How would you suggest scoring the results? I want to run this over my whole index and find the best match. Should I count the number of keypoints left after filtering, sum all the distances, or take the mean distance? I don't know which criterion would work best. - Hacky

9
I ran into a similar problem with the Python bindings of OpenCV and got here via Google.
To solve my problem I wrote Python code for match filtering based on @KLowe's solution. I'll share it here in case someone else runs into the same problem:
""" Clear matches for which NN ratio is > than threshold """
def filter_distance(matches):
    dist = [m.distance for m in matches]
    thres_dist = (sum(dist) / len(dist)) * ratio

    sel_matches = [m for m in matches if m.distance < thres_dist]
    #print '#selected matches:%d (out of %d)' % (len(sel_matches), len(matches))
    return sel_matches

""" keep only symmetric matches """
def filter_asymmetric(matches, matches2, k_scene, k_ftr):
    sel_matches = []
    for match1 in matches:
        for match2 in matches2:
            if match1.queryIdx < len(k_ftr) and match2.queryIdx < len(k_scene) and \
                match2.trainIdx < len(k_ftr) and match1.trainIdx < len(k_scene) and \
                            k_ftr[match1.queryIdx] == k_ftr[match2.trainIdx] and \
                            k_scene[match1.trainIdx] == k_scene[match2.queryIdx]:
                sel_matches.append(match1)
                break
    return sel_matches

def filter_ransac(matches, kp_scene, kp_ftr, countIterations=2):
    if countIterations < 1 or len(kp_scene) < minimalCountForHomography:
        return matches

    p_scene = []
    p_ftr = []
    for m in matches:
        p_scene.append(kp_scene[m.queryIdx].pt)
        p_ftr.append(kp_ftr[m.trainIdx].pt)

    if len(p_scene) < minimalCountForHomography:
        return None

    F, mask = cv2.findFundamentalMat(np.float32(p_ftr), np.float32(p_scene), cv2.FM_RANSAC)
    if mask is None:  # estimation can fail on degenerate input; bail out
        return None
    sel_matches = []
    for m, status in zip(matches, mask):
        if status:
            sel_matches.append(m)

    #print '#ransac selected matches:%d (out of %d)' % (len(sel_matches), len(matches))

    return filter_ransac(sel_matches, kp_scene, kp_ftr, countIterations-1)



def filter_matches(matches, matches2, k_scene, k_ftr):
    matches = filter_distance(matches)
    matches2 = filter_distance(matches2)
    matchesSym = filter_asymmetric(matches, matches2, k_scene, k_ftr)
    if len(k_scene) >= minimalCountForHomography:
        return filter_ransac(matchesSym, k_scene, k_ftr)

To filter matches, call filter_matches(matches, matches2, k_scene, k_ftr), where matches and matches2 are the matches obtained with the ORB matcher and k_scene and k_ftr are the corresponding keypoints.

Thank you, this is great! I was about to write a small Python OpenCV script using feature matching, and you saved me the porting work! - KLowe

1

I don't think there is anything very wrong with your code. In my experience, OpenCV's ORB is sensitive to scale variations.

You can confirm this with a small test: make some images with rotation only and some with scale variation only. The rotated images will probably match well, but the rescaled ones will not (I think downscaling is the worst case).
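
A minimal sketch of such a test (my suggested warps and parameters, not something from the original post) could generate both variants from one source image:

// Sketch: build a rotation-only and a scale-only variant of a test image,
// then run the same detect/extract/match pipeline on each pair.
cv::Mat src = cv::imread("test.png", CV_LOAD_IMAGE_GRAYSCALE);
// rotation only: 45 degrees about the image center, scale factor 1.0
cv::Mat rotM = cv::getRotationMatrix2D(
    cv::Point2f(src.cols / 2.0f, src.rows / 2.0f), 45.0, 1.0);
cv::Mat rotated;
cv::warpAffine(src, rotated, rotM, src.size());
// scale only: shrink to 50% (downscaling tends to be the worst case)
cv::Mat scaled;
cv::resize(src, scaled, cv::Size(), 0.5, 0.5);
// match (src, rotated) and (src, scaled) separately and compare counts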

I also suggest trying an OpenCV build from trunk (see the compilation instructions on the OpenCV site): ORB has been updated since 2.3.1 and performs slightly better, but it still has those scale problems.


Thanks for the insight. I will test more feature detectors. I have been avoiding SIFT and SURF because, as far as I know, they are both patented. Do you have any other feature detectors to recommend? - KLowe
SIFT and SURF are much slower than ORB. If I really needed SURF's accuracy and were programming for a desktop (I am programming for mobile), I would try the GPU version of SURF (OpenCV has a GPU implementation of SURF, and of ORB as well) to see if it is fast enough. There is also the FAST detector, which is fast but less accurate, and there is BRIEF. BRIEF is not rotation invariant, but you can work around that by supplying several rotated query images (I recommend reading this site and its code to understand how BRIEF is used: http://cvlab.epfl.ch/software/brief/index.php). - Rui Marques
For my purposes, the problem was mostly in the filtering. I found another Stack Overflow answer pointing to the book OpenCV 2 Computer Vision Application Programming Cookbook, Chapter 9: Matching Images Using Random Sample Consensus. Instead of just culling every match beyond a given distance, they apply 3 different filters, which leaves me with many more good matches. Previously I was culling every match with a distance above 15.0, which kept only good matches but also threw out a lot of good matches in the process. - KLowe
OK, I will take a look. Could you share the corrections you made to your code? That way you can answer your own question ;) - Rui Marques
