OpenGL code not being sent to GPU?!

Hi, please please please help!

My OpenGL Xcode project isn't using the graphics card! It's only using the CPU, which slows the program right down once the graphics get more intense. I confirmed this by downloading 'atMonitor', and sure enough CPU usage goes up while GPU usage remains at 0%…

The code runs fine and very fast under Linux, but I don't think it's a porting issue: I set up a new, very basic draw-a-triangle Xcode OpenGL project and that doesn't use the GPU either…

It compiles and runs fine - an Xcode Cocoa project with the OpenGL and GLUT frameworks added…

Strangely, this problem doesn't seem to happen to anyone else; googling doesn't turn up any forum posts about similar issues…

I've tried this code on three different Macs - one of which is the newest MacBook Pro there is - and it still runs the graphics badly and incredibly slowly without using the graphics card…

Could someone explain - or perhaps send me a really basic one-file piece of OpenGL code (draw a triangle, for example) that you think does use the graphics card via the OpenGL and GLUT frameworks? Then I could test that out and hack on it if it works.

Thank you very much!!!

Kate :slight_smile:

Without more details it's hard to help specifically, but if you google for the GLUT Xcode project you'll find exactly what you're after.

One of the regular posters here has also done a help page on this…

It’s a little old, but you should be able to work your way through it.
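In case it helps in the meantime, here is a minimal one-file GLUT triangle of the kind you asked for - a sketch only, assuming the standard GLUT headers (on the Mac it's `<GLUT/glut.h>`, and you link with `-framework OpenGL -framework GLUT`). If this renders instantly on your machine, the basic pipeline is working:

```c
/* Minimal one-file GLUT triangle, fixed-function OpenGL.
   Mac: compile with  gcc tri.c -framework OpenGL -framework GLUT
   Linux (freeglut):  gcc tri.c -lGL -lglut                        */
#ifdef __APPLE__
#include <GLUT/glut.h>
#else
#include <GL/glut.h>
#endif

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);

    /* One colored triangle in normalized device coordinates */
    glBegin(GL_TRIANGLES);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.6f, -0.5f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.6f, -0.5f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.6f);
    glEnd();

    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(640, 480);
    glutCreateWindow("triangle");
    glutDisplayFunc(display);
    glutMainLoop();  /* never returns */
    return 0;
}
```

Note that a single triangle like this won't register any measurable GPU load in a monitor, so don't read too much into the usage figures for a program this small.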

Hiya, thanks… that's exactly how I'd set my code up. I've just set up this one too, and again it works fine, but my performance monitor says only the CPU is being used. For small programs this makes no visual difference, but the large simulation package I developed on Linux is very slow and jumpy when run here, with no GPU usage reported… CPU only.

Interestingly, I've just downloaded a free OpenGL game, RubberNinjas, from the Apple website, and that also runs slowly and jumpily, and atMonitor shows no GPU usage…

I've since read a load of articles saying last year's MacBook Pros came with faulty graphics cards… hmm, maybe the GPU just isn't working - though it seems unlikely that all three Macs I've tested this on have faulty graphics cards, particularly the brand-new one.

Oh, I don't know what to do short of testing the graphics hardware, and I don't have the disks. I don't suppose there's some way of seeing where Xcode is sending the graphics commands? Or some setting in the frameworks that tells it to send to the GPU that might be broken?

Or do you feel like downloading atMonitor (it's free) and seeing if you can get it to give a GPU usage reading when you run some code? Maybe the monitor itself doesn't work!

Kate :slight_smile:

It certainly sounds very strange that this problem would occur on more than one Mac, let alone a new MBP… If the GPU really weren't working, I'd expect a lot more problems with day-to-day use of the machine itself. Are you saying that any commercial app has the same problem on your machine? By the way, the new MBPs have two GPUs. In System Preferences you can toggle between them by changing the energy saving settings; it will require you to log out and back in. To be honest, both GPUs are pretty good, so I wouldn't expect you to see a difference in normal usage, but it may be worth a try to see what happens on your machine…

I am not familiar with atMonitor. Have you tried running your app in OpenGL Profiler? It is designed for checking what the GPU is up to, and will show you if and where your app is falling back to software rendering. You can even set breakpoints so that it stops at the exact call where that happens.

OpenGL Profiler is part of the developer tools that come with Xcode.

Another useful app, which comes with a 30-day demo period, is gDEBugger, which I recently beta-tested. It has a very nice way of showing individual core loads as well as GPU load while you run OpenGL apps.

Read The FAQ.

Query GL_RENDERER after your context is created to see whether you got a hardware renderer.
Query kCGLCPGPUFragmentProcessing after drawing to see whether you drew on the GPU.

Most likely, your app is using the GPU, but it is either too simple to show much GPU time being used, or you are using a slow API like glBegin, which has a lot of CPU overhead.

atMonitor does show GPU usage on my MacBook Pro (GeForce 8600) when I run any "real" application, like World of Warcraft.
