OpenCV calculate time detection features


I'm trying to calculate the time that my program takes to detect the keypoints of an image. If I run the detection twice in my C++ program (on the same image), there is a huge difference between the two runs: the first time it takes around 600-800 ms and the second time just 100-200 ms.

Does anyone know what is happening?

Here is the code where I get the times:

struct timeval t1, t2;
Ptr<SURF> detector = SURF::create(400);

gettimeofday(&t1, 0x0);
detector->detect( imagen1, keypoints_1 );
gettimeofday(&t2, 0x0);

int milliSeconds = Utils::calculateDiff(t1, t2);

Here is the code where I calculate the diff:

static int calculateDiff(timeval t1, timeval t2)
{
    return (t2.tv_sec - t1.tv_sec) * 1000 + (t2.tv_usec - t1.tv_usec)/1000;
}
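
For reference, here is a minimal, self-contained sketch of the setup above. The image file name and the xfeatures2d (opencv_contrib) include are assumptions on my part; the variable names follow the original snippet.

#include <cstdio>
#include <vector>
#include <sys/time.h>
#include <opencv2/core.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/xfeatures2d.hpp>

using namespace cv;
using namespace cv::xfeatures2d;

static int calculateDiff(timeval t1, timeval t2)
{
    return (t2.tv_sec - t1.tv_sec) * 1000 + (t2.tv_usec - t1.tv_usec) / 1000;
}

int main()
{
    // "imagen1.jpg" is a placeholder file name
    Mat imagen1 = imread("imagen1.jpg", IMREAD_GRAYSCALE);
    if (imagen1.empty()) return 1;

    std::vector<KeyPoint> keypoints_1;
    Ptr<SURF> detector = SURF::create(400);

    // time the same call twice to reproduce the difference described above
    for (int i = 0; i < 2; ++i)
    {
        timeval t1, t2;
        gettimeofday(&t1, 0x0);
        detector->detect(imagen1, keypoints_1);
        gettimeofday(&t2, 0x0);
        printf("run %d: %d ms\n", i + 1, calculateDiff(t1, t2));
    }
    return 0;
}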

Here is a sample:

[sample image]

There are 2 best solutions below


Note that gettimeofday uses wall-clock time, while problems like this usually require CPU/clock time.

For profiling, try something (even more portable) like this:

int64 t0 = cv::getTickCount();
//
// some lengthy op.
//
int64 t1 = cv::getTickCount();
double secs = (t1-t0)/cv::getTickFrequency();
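
Applied to the detect() call from the question, that would look something like the sketch below (imagen1, keypoints_1 and detector are the variables from the original code; printing the result needs <iostream>):

// needs <iostream> and the OpenCV core header for getTickCount()/getTickFrequency()
int64 t0 = cv::getTickCount();
detector->detect(imagen1, keypoints_1);
int64 t1 = cv::getTickCount();
double secs = (t1 - t0) / cv::getTickFrequency();
std::cout << "detect() took " << secs << " s" << std::endl;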

You can use getTickCount() and getTickFrequency() to measure time. However, there is a truncation problem when using these functions. After some tries, this code worked for me:

// accumulate per-iteration and total processing time, in seconds;
// convert the tick counts to double before dividing to avoid truncation
double execTime = 0, totalTime = 0;

for (;;)
{
    double prevCount = (double)getTickCount();

    /* do image processing */

    execTime = ((double)getTickCount() - prevCount) / getTickFrequency();
    totalTime += execTime;
    cout << "execTime = " << execTime << "; time = " << totalTime << endl;
}