First of all, I don’t actually know if my problem is related to OpenGL, but I suspect it might be. So if you want to hit me for posting this in the wrong forum, please do it softly.
I have a program that uses OpenGL to display a scene. It runs fine on Linux and Windows, but I have problems getting it to work on Mac OS X. In some cases (it seems to be related to the number of objects in the scene, not to any specific object being rendered), when I start the program my screen starts to act weird: it flickers and I only see the upper half of the screen (sometimes more, sometimes less). It looks a bit like the GPU draws half of the screen and then clears it again to continue with the next frame. I can still move my mouse, but when I click somewhere, OS X does not seem to respond. After some time the whole system freezes and I have to forcefully restart the computer.
This might just be a bug in my program (it’s a prototype still in development), but I wonder how an error in my program can mess up the whole screen (not only the OpenGL window) and crash the entire system. I’ve already looked for an obvious error in my code, but it’s hard to track down because it seems to appear and disappear quite randomly, so I thought I’d ask here first in case anybody knows what could be happening.
I’m using a Mac Mini with OS X 10.6.7 Snow Leopard and an Nvidia GeForce 9400.
Thanks in advance.