Getting an exception with pattern recognition algorithms SURF and SIFT in OpenCV for Android

StackOverflow https://stackoverflow.com/questions/9836651


Question

I want to implement a simple application (a modification of sample2) that shows what SIFT, SURF, BRIEF and ORB do, so the user can compare their rotation invariance, scale invariance and speed. But I hit a failure I cannot handle, so I am turning to you for help. When I try to use SIFT or SURF I always get an exception on the line where I try to match: matcherBruteForce.match(descriptorFrame, matches);

I have a similar AR application that works with these settings, so I cannot figure out where I am making a mistake. I have tried setting the variable "matcherBruteForce" to BRUTEFORCE, BRUTEFORCE_L1, BRUTEFORCE_SL2 and even BRUTEFORCE_HAMMING, but I always get the same exceptions:

SIFT:

CvException [org.opencv.core.CvException: /home/andreyk/OpenCV2/trunk/opencv_2.3.1.b2/modules/features2d/include/opencv2/features2d/features2d.hpp:2455: error: (-215) DataType<ValueType>::type == matcher.trainDescCollection[iIdx].type() || matcher.trainDescCollection[iIdx].empty() in function static void cv::BruteForceMatcher<Distance>::commonKnnMatchImpl(cv::BruteForceMatcher<Distance>&, const cv::Mat&, std::vector<std::vector<cv::DMatch> >&, int, const std::vector<cv::Mat>&, bool) [with Distance = cv::SL2<float>]
]

SURF:

CvException [org.opencv.core.CvException: /home/andreyk/OpenCV2/trunk/opencv_2.3.1.b2/modules/features2d/include/opencv2/features2d/features2d.hpp:2455: error: (-215) DataType<ValueType>::type == matcher.trainDescCollection[iIdx].type() || matcher.trainDescCollection[iIdx].empty() in function static void cv::BruteForceMatcher<Distance>::commonKnnMatchImpl(cv::BruteForceMatcher<Distance>&, const cv::Mat&, std::vector<std::vector<cv::DMatch> >&, int, const std::vector<cv::Mat>&, bool) [with Distance = cv::SL2<float>]
]

Any help is appreciated.

The whole class:

package sk.bolyos.opencv;

import java.util.Vector;

import org.opencv.features2d.DMatch;
import org.opencv.features2d.DescriptorExtractor;
import org.opencv.features2d.DescriptorMatcher;
import org.opencv.features2d.FeatureDetector;
import org.opencv.features2d.Features2d;
import org.opencv.features2d.KeyPoint;
import org.opencv.highgui.VideoCapture;
import org.opencv.android.Utils;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;
import org.opencv.highgui.Highgui;

import sk.bolyos.svk.*;

import android.content.Context;
import android.graphics.Bitmap;
import android.util.Log;
import android.view.SurfaceHolder;




public class MyView extends CvViewBase {

    private static final int BOUNDARY = 35;

    private Mat mRgba;
    private Mat mGray;
    private Mat mIntermediateMat;
    private Mat mLogoMilka1,mLogoMilka2,mLogoMilka3,mLogoMilka4;
    ///////////////////DETECTORS
    FeatureDetector siftDetector = FeatureDetector.create(FeatureDetector.SIFT);
    FeatureDetector surfDetector = FeatureDetector.create(FeatureDetector.SURF);
    FeatureDetector fastDetector = FeatureDetector.create(FeatureDetector.FAST);
    FeatureDetector orbDetector = FeatureDetector.create(FeatureDetector.ORB);
    ///////////////////DESCRIPTORS
    DescriptorExtractor siftDescriptor = DescriptorExtractor.create(DescriptorExtractor.SIFT);
    DescriptorExtractor surfDescriptor = DescriptorExtractor.create(DescriptorExtractor.SURF);
    DescriptorExtractor briefDescriptor = DescriptorExtractor.create(DescriptorExtractor.BRIEF);
    DescriptorExtractor orbDescriptor = DescriptorExtractor.create(DescriptorExtractor.ORB);
    ///////////////////DATABASE
    Vector<KeyPoint> vectorMilka1 = new Vector<KeyPoint>();
    Vector<KeyPoint> vectorMilka2 = new Vector<KeyPoint>();
    Vector<KeyPoint> vectorMilka3 = new Vector<KeyPoint>();
    Vector<KeyPoint> vectorMilka4 = new Vector<KeyPoint>();
    Mat descriptorMilka1 = new Mat();
    Mat descriptorMilka2 = new Mat();
    Mat descriptorMilka3 = new Mat(); 
    Mat descriptorMilka4 = new Mat();
    ///////////////////VIDEO
    Vector<KeyPoint> vectorFrame = new Vector<KeyPoint>();
    Mat descriptorFrame = new Mat();

    DescriptorMatcher matcherHamming = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMINGLUT);
    DescriptorMatcher matcherBruteForce = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_SL2);
    Vector<DMatch> matches = new Vector<DMatch>();
    Vector<Mat> siftDescriptors = new Vector<Mat>();
    Vector<Mat> surfDescriptors = new Vector<Mat>();
    Vector<Mat> briefDescriptors = new Vector<Mat>();
    Vector<Mat> orbDescriptors = new Vector<Mat>();

    public MyView(Context context) {
        super(context);
        // TODO Auto-generated constructor stub
        try{
            /*
            if (mLogoMilka1 == null){
                mLogoMilka1 = new Mat();
                mLogoMilka1 = Utils.loadResource(getContext(), R.drawable.milkalogo);
                fillDB(mLogoMilka1,vectorMilka1,descriptorMilka1);
            }
            if (mLogoMilka2 == null){
                mLogoMilka2 = new Mat();
                mLogoMilka2 = Utils.loadResource(getContext(), R.drawable.milkalogom);
                fillDB(mLogoMilka2,vectorMilka2,descriptorMilka2);
            }
            if (mLogoMilka3 == null){
                mLogoMilka3 = new Mat();
                mLogoMilka3 = Utils.loadResource(getContext(), R.drawable.milkalogol);
                fillDB(mLogoMilka3,vectorMilka3,descriptorMilka3);
            }*/
            if (mLogoMilka4 == null){
                mLogoMilka4 = new Mat();
                mLogoMilka4 = Utils.loadResource(getContext(), R.drawable.milkalogolc);
                fillDB(mLogoMilka4,vectorMilka4,descriptorMilka4);
            }

        }catch(Exception e){
            Log.e( "SVK APPLICATION", "in MyView constructor "+e.toString());
        }
    }

    public void fillDB(Mat mLogo,Vector<KeyPoint> vector,Mat descriptor){

      //SIFT 
        siftDetector.detect( mLogo, vector );
        siftDescriptor.compute(mLogo, vector, descriptor);
        siftDescriptors.add(descriptor);
      //SURF 
        surfDetector.detect( mLogo, vector );
        surfDescriptor.compute(mLogo, vector, descriptor);
        surfDescriptors.add(descriptor);
      //FAST+BRIEF 
        fastDetector.detect( mLogo, vector );
        briefDescriptor.compute(mLogo, vector, descriptor);
        briefDescriptors.add(descriptor);
      //ORB 
        orbDetector.detect( mLogo, vector );
        orbDescriptor.compute(mLogo, vector, descriptor);
        orbDescriptors.add(descriptor);

    }


    @Override
    public void surfaceChanged(SurfaceHolder _holder, int format, int width, int height) {
        super.surfaceChanged(_holder, format, width, height);

        synchronized (this) {
            // initialize Mats before usage
            mGray = new Mat();
            mRgba = new Mat();
            mIntermediateMat = new Mat();
            matches = new Vector<DMatch>();
            vectorFrame = new Vector<KeyPoint>();
            descriptorFrame = new Mat(); 
        }
    }

    @Override
    protected Bitmap processFrame(VideoCapture capture) {
        // TODO Auto-generated method stub
        switch (SVKApplikaciaActivity.viewMode) {
        case SVKApplikaciaActivity.VIEW_MODE_SIFT:
            //TODO SIFT
            try{
                //matcherBruteForce = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE);
                //matcherBruteForce.clear();
                matcherBruteForce.add(siftDescriptors);
                matcherBruteForce.train();// proba

                capture.retrieve(mGray, Highgui.CV_CAP_ANDROID_COLOR_FRAME_RGBA);
                Imgproc.resize(mGray, mGray,new Size(480,320)); 
                siftDetector.detect( mGray, vectorFrame );
                siftDescriptor.compute(mGray, vectorFrame, descriptorFrame);

                matcherBruteForce.match(descriptorFrame, matches);  
                Vector<DMatch> matchesXXX = new Vector<DMatch>();
                for (DMatch t : matches)
                    if(t.distance<BOUNDARY)
                        matchesXXX.add(t);
                Mat nGray = new Mat();
                Mat nLogo = new Mat();
                Mat nRgba = new Mat();
                Imgproc.cvtColor(mGray, nGray, Imgproc.COLOR_RGBA2RGB, 3);
                Imgproc.cvtColor(mLogoMilka4, nLogo, Imgproc.COLOR_RGBA2BGR, 3);
                Features2d.drawMatches(nGray, vectorFrame, nLogo, vectorMilka4, matchesXXX, nRgba);
                Imgproc.cvtColor(nRgba, mRgba, Imgproc.COLOR_RGB2RGBA, 4);
            }catch(Exception e){
                Log.e( "SVK APPLICATION","in SIFT "+ e.toString());
            }
            break;
        case SVKApplikaciaActivity.VIEW_MODE_SURF:
            //TODO SURF
            try{
                //matcherBruteForce = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE);
                //matcherBruteForce.clear();
                matcherBruteForce.add(surfDescriptors);
                matcherBruteForce.train();// proba

                capture.retrieve(mGray, Highgui.CV_CAP_ANDROID_COLOR_FRAME_RGBA);
                Imgproc.resize(mGray, mGray,new Size(480,320)); 
                surfDetector.detect( mGray, vectorFrame );
                surfDescriptor.compute(mGray, vectorFrame, descriptorFrame);

                matcherBruteForce.match(descriptorFrame, matches);  
                Vector<DMatch> matchesXXX = new Vector<DMatch>();
                for (DMatch t : matches)
                    if(t.distance<BOUNDARY)
                        matchesXXX.add(t);
                Mat nGray = new Mat();
                Mat nLogo = new Mat();
                Mat nRgba = new Mat();
                Imgproc.cvtColor(mGray, nGray, Imgproc.COLOR_RGBA2RGB, 3);
                Imgproc.cvtColor(mLogoMilka4, nLogo, Imgproc.COLOR_RGBA2BGR, 3);
                Features2d.drawMatches(nGray, vectorFrame, nLogo, vectorMilka4, matchesXXX, nRgba);
                Imgproc.cvtColor(nRgba, mRgba, Imgproc.COLOR_RGB2RGBA, 4);
            }catch(Exception e){
                Log.e( "SVK APPLICATION","in Surf "+ e.toString());
            }
            break;
        case SVKApplikaciaActivity.VIEW_MODE_BRIEF:
            //TODO BRIEF
            try{
                matcherHamming = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMINGLUT);
                matcherHamming.add(briefDescriptors);
                matcherHamming.train();// proba

                capture.retrieve(mGray, Highgui.CV_CAP_ANDROID_COLOR_FRAME_RGBA);
                Imgproc.resize(mGray, mGray,new Size(480,320)); 
                fastDetector.detect( mGray, vectorFrame );
                briefDescriptor.compute(mGray, vectorFrame, descriptorFrame);

                matcherHamming.match(descriptorFrame, matches); 
                Vector<DMatch> matchesXXX = new Vector<DMatch>();
                for (DMatch t : matches)
                    if(t.distance<BOUNDARY)
                        matchesXXX.add(t);
                Mat nGray = new Mat();
                Mat nLogo = new Mat();
                Mat nRgba = new Mat();
                Imgproc.cvtColor(mGray, nGray, Imgproc.COLOR_RGBA2RGB, 3);
                Imgproc.cvtColor(mLogoMilka4, nLogo, Imgproc.COLOR_RGBA2BGR, 3);
                Features2d.drawMatches(nGray, vectorFrame, nLogo, vectorMilka4, matchesXXX, nRgba);
                Imgproc.cvtColor(nRgba, mRgba, Imgproc.COLOR_RGB2RGBA, 4);
            }catch(Exception e){
                Log.e( "SVK APPLICATION","in Brief "+ e.toString());
            }
            break;
        case SVKApplikaciaActivity.VIEW_MODE_ORB:
            //TODO ORB
            try{
                matcherHamming = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMINGLUT);
                matcherHamming.add(orbDescriptors);
                matcherHamming.train();// proba

                capture.retrieve(mGray, Highgui.CV_CAP_ANDROID_COLOR_FRAME_RGBA);
                Imgproc.resize(mGray, mGray,new Size(480,320)); 
                orbDetector.detect( mGray, vectorFrame );
                orbDescriptor.compute(mGray, vectorFrame, descriptorFrame);

                matcherHamming.match(descriptorFrame, matches); 
                Vector<DMatch> matchesXXX = new Vector<DMatch>();
                for (DMatch t : matches)
                    if(t.distance<BOUNDARY)
                        matchesXXX.add(t);
                Mat nGray = new Mat();
                Mat nLogo = new Mat();
                Mat nRgba = new Mat();
                Imgproc.cvtColor(mGray, nGray, Imgproc.COLOR_RGBA2RGB, 3);
                Imgproc.cvtColor(mLogoMilka4, nLogo, Imgproc.COLOR_RGBA2BGR, 3);
                Features2d.drawMatches(nGray, vectorFrame, nLogo, vectorMilka4, matchesXXX, nRgba);
                Imgproc.cvtColor(nRgba, mRgba, Imgproc.COLOR_RGB2RGBA, 4);
                }catch(Exception e){
                    Log.e( "SVK APPLICATION","in ORB "+ e.toString());
                }
            break;  
        case SVKApplikaciaActivity.VIEW_MODE_AR:
            //TODO AR
            break;    

        }

        Bitmap bmp = Bitmap.createBitmap(mRgba.cols(), mRgba.rows(), Bitmap.Config.ARGB_8888);

        if (Utils.matToBitmap(mRgba, bmp))
            return bmp;

        bmp.recycle();

        return null;
    }

    @Override
    public void run() {
        super.run();

        synchronized (this) {
            // Explicitly deallocate Mats
            if (mRgba != null)
                mRgba.release();
            if (mGray != null)
                mGray.release();
            if (mIntermediateMat != null)
                mIntermediateMat.release();

            mRgba = null;
            mGray = null;
            mIntermediateMat = null;
        }
    }

}

Solution

I think I know the problem. The matcher you are using cannot be applied to SIFT and SURF descriptors. If you must use a DescriptorMatcher with SIFT or SURF, you must create it like this:

DescriptorMatcher matcherBruteForce = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_SL2);

Since SURF and SIFT produce float-based descriptors exclusively, matching will fail with an error if you pass their descriptors to a DescriptorMatcher set to HAMMING.

Notice that in your code you have two DescriptorMatchers, one set to BRUTEFORCE_SL2 and the other to HAMMING. Ensure you pass the correct one, i.e. the BRUTEFORCE_SL2 matcher, to the SIFT and SURF descriptors, as sketched below.
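As a minimal sketch of that idea, assuming the same old OpenCV Java binding used in the question (where match() takes a List&lt;DMatch&gt;) and reusing the question's descriptorFrame, siftDescriptors and orbDescriptors fields (CvType comes from org.opencv.core), the matcher could be chosen from the descriptor type at runtime:

// Sketch only: pick the matcher that fits the descriptor type.
// Float descriptors (SIFT, SURF -> CV_32F) need an L2-style brute-force matcher;
// binary descriptors (BRIEF, ORB -> CV_8U) need a Hamming brute-force matcher.
Vector<DMatch> matches = new Vector<DMatch>();
DescriptorMatcher matcher;

if (descriptorFrame.type() == CvType.CV_32F) {
    matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_SL2);
    matcher.add(siftDescriptors);   // train descriptors must also be CV_32F
} else {
    matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
    matcher.add(orbDescriptors);    // train descriptors must also be CV_8U
}
matcher.train();
matcher.match(descriptorFrame, matches);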

It is, however, best to use FLANN-based matchers for SIFT or SURF, since they extract a higher number of keypoints compared to ORB, and FLANN is suited to large sets of keypoints. Read more about it here http://computer-vision-talks.com/2011/07/comparison-of-the-opencvs-feature-detection-algorithms-ii/

and here http://opencv.willowgarage.com/documentation/cpp/flann_fast_approximate_nearest_neighbor_search.html
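For example, a FLANN-based matcher for the float SIFT/SURF descriptors could be set up like this (again a sketch against the same Java binding, reusing the question's siftDescriptors, descriptorFrame and matches):

// Sketch: FLANN matcher, usable with float (SIFT/SURF) descriptors only.
DescriptorMatcher flannMatcher = DescriptorMatcher.create(DescriptorMatcher.FLANNBASED);
flannMatcher.add(siftDescriptors);   // CV_32F train descriptors
flannMatcher.train();
flannMatcher.match(descriptorFrame, matches);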

UPDATE: It is possible to use L2 or L1 distance for matching uchar descriptors. If you pass a DescriptorMatcher set to BRUTEFORCE, it may work for ORB as well (albeit with poor results), for example:
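A one-line sketch of that variant (the BRUTEFORCE constant belongs to the same DescriptorMatcher class; the variable name l2Matcher is only illustrative):

// Sketch: plain L2 brute-force matcher; per the note above it may also accept uchar (e.g. ORB) descriptors.
DescriptorMatcher l2Matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE);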

Other tips

Are you sure the size of your vectorFrame isn't zero? I think I had the same problem. The problem would be in the detection algorithm; I think it returns an empty vectorFrame when the color code of your image isn't right.

Just put Log.e( "SVK APPLICATION","vectorFrame size = "+ vectorFrame.size()); somewhere to check, as in the sketch below.
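A minimal sketch of that check, reusing the question's fields (siftDetector, siftDescriptor, mGray, vectorFrame, descriptorFrame, matcherBruteForce, matches) and skipping the match when nothing was detected:

siftDetector.detect(mGray, vectorFrame);
Log.e("SVK APPLICATION", "vectorFrame size = " + vectorFrame.size());

if (!vectorFrame.isEmpty()) {
    siftDescriptor.compute(mGray, vectorFrame, descriptorFrame);
    if (!descriptorFrame.empty()) {          // nothing to match on an empty Mat
        matcherBruteForce.match(descriptorFrame, matches);
    }
}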

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow