I'm trying to measure the time my program takes to detect the keypoints in an image. If I run the detection twice in my C++ program (on the same image), there is a huge difference between the two runs: the first takes around 600-800 ms and the second only 100-200 ms.
Does anyone know what is happening?
Here is the code where I get the times:
    struct timeval t1, t2;
    Ptr<SURF> detector = SURF::create(400);

    gettimeofday(&t1, 0x0);                    // timestamp before detection
    detector->detect( imagen1, keypoints_1 );  // detect SURF keypoints in imagen1
    gettimeofday(&t2, 0x0);                    // timestamp after detection
    int milliSeconds = Utils::calculateDiff(t1, t2);
Here is the code where I calculate the diff:
    static int calculateDiff(timeval t1, timeval t2)
    {
        // convert the timeval difference to whole milliseconds
        return (t2.tv_sec - t1.tv_sec) * 1000 + (t2.tv_usec - t1.tv_usec) / 1000;
    }
Here is a sample:
Note that gettimeofday measures wall-clock time, while problems like this usually call for CPU/clock time.
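To make the distinction concrete, here is a minimal sketch (my illustration, not part of the original answer) that measures CPU time around the same call with std::clock from <ctime>; detector, imagen1 and keypoints_1 are the variables from the question:

    #include <ctime>

    std::clock_t c1 = std::clock();            // CPU time consumed so far
    detector->detect(imagen1, keypoints_1);
    std::clock_t c2 = std::clock();            // CPU time after the call

    // CLOCKS_PER_SEC converts clock ticks to seconds; scale to milliseconds.
    double cpuMs = 1000.0 * (c2 - c1) / CLOCKS_PER_SEC;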
For profiling, try something (even more portable), like this:
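For instance, a minimal sketch using cv::getTickCount and cv::getTickFrequency from OpenCV's core module (one portable option, since these work on every platform OpenCV supports):

    #include <cstdint>
    #include <opencv2/core.hpp>

    int64_t t0 = cv::getTickCount();

    // ... some lengthy op, e.g. detector->detect(imagen1, keypoints_1);

    int64_t t1 = cv::getTickCount();

    // getTickFrequency() returns ticks per second, so this yields seconds.
    double secs = (t1 - t0) / cv::getTickFrequency();

Because the tick frequency is queried at runtime, the same code gives comparable numbers across platforms without any gettimeofday-style platform dependence.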