Measuring graph processing time

I’m trying to measure the time for graph processing, but its variation is too large.
Why does this happen? And what is a better way to measure the time?

Part of the code:

status = vxVerifyGraph(graph);
if (status == VX_SUCCESS)
{
    int64 start = cv::getTickCount();
    status = vxProcessGraph(graph);
    int64 end = cv::getTickCount();
    int64 diff = end - start;   // elapsed time in clock ticks
    std::cout << "Time:" << diff << std::endl;
}
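
Note that cv::getTickCount() returns ticks, not nanoseconds; the nanosecond values below presumably come from a conversion using cv::getTickFrequency() (which reports ticks per second), along these lines:

// convert the tick difference to nanoseconds
double ns = (double)diff * 1e9 / cv::getTickFrequency();
std::cout << "Time:" << ns << " ns" << std::endl;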

Results:

1. 2348981 ns
2. 1117763 ns
3. 775396 ns

The vxVerifyGraph call should only be used once; it’s more of an initialization and verification step. The first vxProcessGraph call usually takes longer than subsequent calls, so run it once as a warm-up and then average the time over many iterations. Use a structure like the one below.

vx_status status = vxVerifyGraph(graph);
if (status != VX_SUCCESS) { exit(1); }

// warm-up run: the first vxProcessGraph() is typically slower
status = vxProcessGraph(graph);
if (status != VX_SUCCESS) { exit(1); }

// time N iterations and report the average
int N = 100;
int64_t freq = clockFrequency();   // ticks per second of clockCounter()
int64_t t0 = clockCounter();
for (int i = 0; i < N; i++) {
    status = vxProcessGraph(graph);
    if (status != VX_SUCCESS)
        break;
}
int64_t t1 = clockCounter();
float msec = (float)(t1 - t0) * 1000.0f / (float)freq / (float)N;
printf("OK: Graph took %.3f msec (average over %d iterations)\n", msec, N);
