using time in c++

If I want to time how long a chunk of code takes to execute, how do I do this?

Basically I wanna see how fast a certain algorithm is on my Pentium III, and then on my Mac G5.

The way I'm doing it now is:
long timeVar = clock();
float timeDelta = 1.00

public algorithmIWannaTime(){
timeDelta = timeDelta+1.00;

executionTime = timeVar + timeDelta * CLOCKS_PER_SECOND;

OK, I have no idea how to do this, I've never done it before in code. Obviously implementing it this way creates a number that's gonna appear the same regardless of what machine I run it on. Has anyone done this before that could tell me how to use C++'s time functions to time a chunk of code?

(Basically my input for the code that is to be timed is going to change a lot, and I wanna know how long it takes to execute on a smaller number vs. a bigger number.)


On Windows you can use GetTickCount() to time an event. If you want a piece of code to run at the same speed on different computers, take a look at this tutorial.

I don't know if there's something C++-specific about it, but I believe you would have to rely on C functions. I think I did something similar once with gettimeofday() in Linux, but I've got a gut feeling it's an OS function and not C's, so I don't know about cross-platform with the Mac. You could take a look at it though.

I don't need them to run the same; actually I want them to run differently (that is, the same program on 2 machines) to see the difference.

I'm not sure how to use GetTickCount() properly. What if I just want to see how long a specific loop executes, and not an entire program?

I believe there is a C function, I think it's time() and difftime(); as I remember they are ANSI functions, so that will make them somewhat portable. If you want to be able to cross-compile on different OSes you could use, for example, GetTickCount() for Windows, gettimeofday() for Linux (or whatever you use), and a Mac-specific function on the Mac, and toggle between them with some #ifdef's.

To find how long a piece of code takes you can use something like this:

time_running = (end - start) / 1000; // start and end from GetTickCount(), so divide by 1000 to get seconds

I can't use GetTickCount() on a Mac :(

I'm very new to Mac so I have no idea what that function would be. I was Googling it and couldn't find it either.

OK, I think my question was unclear.

I'm not very experienced with C++ either.

This is how I do this in Java:

import java.util.Date;

Date clock;
long oldClock;

clock = new Date();
oldClock = clock.getTime();
doFunction( funcArg );
clock = new Date();
System.out.println("doFunction run time (msec.): " + (clock.getTime() - oldClock));


That would do what I want it to do in Java. How do I do the same thing using the time.h functions in C++?

Use the ANSI C function for timing it; grab any ANSI C tutorial and I suppose it will work on any C compiler. The function name is time().

Look for glutGet(GLUT_ELAPSED_TIME).


It's really exact and works well.