I’m trying to measure the time taken to process a graph, but the variation between runs is too large.
Why does this happen? And please tell me a better way to measure the time.
Part of the code:
status = vxVerifyGraph(graph);
if (status == VX_SUCCESS)
{
    int64 start = cv::getTickCount();
    status = vxProcessGraph(graph);
    int64 end = cv::getTickCount();
    int64 diff = end - start;
    // diff is in raw ticks; divide by cv::getTickFrequency() to get seconds
    std::cout << "Time: " << diff << std::endl;
}
The vxVerifyGraph call should only be made once; it's an initialization and verification step, not part of steady-state processing. The first vxProcessGraph call usually takes longer than subsequent calls (one-time allocation and setup happen there), and timing a single run is inherently noisy. So do one warm-up call outside the measurement, then average over many iterations. Use a structure like the one below.
status = vxVerifyGraph(graph);
if (status != VX_SUCCESS) { /* report the error and exit */ }

// Warm-up run: the first vxProcessGraph call is typically slower,
// so keep it out of the measured region
status = vxProcessGraph(graph);
if (status != VX_SUCCESS) { /* report the error and exit */ }

const int N = 100;
int64_t t0 = clockCounter();   // clockCounter(): your platform's high-resolution tick source
for (int i = 0; i < N; i++) {
    status = vxProcessGraph(graph);
    if (status != VX_SUCCESS)
        break;
}
int64_t t1 = clockCounter();
// freq is the tick rate (ticks per second) of the same clock source
float msPerRun = (float)(t1 - t0) * 1000.0f / (float)freq / (float)N;
printf("OK: Graph took %.3f msec (average over %d iterations)\n", msPerRun, N);