OpenGL applications == max CPU usage

Hello,

I’ve just started with OpenGL coding, using VC++ (Visual Studio 2005) as the IDE. I ran the first small program from a book I’m reading, and it takes up 99% CPU or so and heats up my computer :frowning: It’s a really simple program that just draws a triangle.
It also strikes me that even Google Earth in OpenGL mode does not run (its CPU usage climbs to 99% or so), and Counter-Strike 1.5 in OpenGL mode crashes my system.
Why would my system reject OpenGL?!

My graphics card is an ATi Mobility Radeon 9700 (128 MB) with the latest Omega drivers, 3.8.273.

If I understand you correctly, you mean that Google Earth does not take 99% CPU time.
Usually an OpenGL application just renders frames as fast as your computer can handle, hence the high CPU usage. It is common practice to add a short “sleep” call inside your rendering loop, which lowers the CPU usage.

Also, I think Google Earth only renders a new frame when the user changes the view with mouse or keyboard input, so it uses almost no CPU time; the operating system can reuse the previous frame without the OpenGL program rendering it again.

Thanks for your reply.
No, Google Earth in OpenGL mode does take 99% CPU usage… and Google Earth doesn’t even start completely: the black space (and stars) never appears. The program sort of hangs before any rendering begins, while taking 99% CPU usage.

Even the triangle program I mentioned before renders nothing at all. It just takes up 99% CPU usage.

Here is the main() of the program I’m trying to run. I’ve commented the function where it (apparently) hangs.

#include <stdio.h>
#include <GL/glut.h>

int main( int argc, char** argv )
{
    printf("Whats happening?!? 1\n");
    glutInit( &argc, argv );

    printf("Whats happening?!? 2\n");
    // Create a single GLUT window, 300x300 pixels, RGB mode,
    // and double-buffered. Call it "Simple Example".
    glutInitDisplayMode( GLUT_RGB | GLUT_DOUBLE );
    printf("Whats happening?!? 3\n");
    glutInitWindowSize( 300, 300 );

    // It prints this, and 'hangs' at the next function
    printf("Whats happening?!? 4\n");
    glutCreateWindow( "Simple Example" );

    // Nothing happens beyond this point
    printf("Whats happening?!? 5\n");
    init();
    printf("Whats happening?!? 6\n");

    // Loop for events.
    glutMainLoop();
    printf("Whats happening?!? 7\n");

    return 0;
}

If I remember correctly, JOGL had a memory leak problem with the ATi Mobility Radeon 9700, and I was wondering if that had anything to do with this.

scratch that…

I just updated the Omega drivers from 3.8.273 to 3.8.291 and everything seems to be working perfectly now…

no idea what was wrong… thanks anyway