I’ve had a number of people tell me that my demo doesn’t work on GeForce cards, so I had no choice but to go and buy a GeForce and debug it :/. The following code was causing the problem:
glutInitDisplayMode(GLUT_DOUBLE | GLUT_DEPTH | GLUT_RGB);
Notice anything intrinsically wrong with this code?
The program was hanging as soon as anything was drawn with glDrawElements.
To cut a long story short, the problem was more specifically to do with the colour array and the fact that each vertex required 3 bytes for colour rather than 4. As soon as I defined the colour pointer to be a float it worked, but of course it slowed my app down. In the end I got round it by sticking with unsigned bytes for colour and using RGBA mode (4 components per vertex), which keeps the colour array 4-byte aligned. This doesn’t just apply to the colour array; the same happens if you use GL_SHORT for the normal array: 2 bytes * 3 components = 6, and 6/4 = 1.5 = crash.
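In case it helps anyone else hitting this, here’s a rough, untested sketch of what I mean (the array names and values are just for illustration). The commented-out glColorPointer call with 3 unsigned bytes per vertex is the kind of setup that hung for me; the 4-component RGBA version is the workaround:

#include <GL/glut.h>

/* 3 unsigned bytes per vertex: colours fall on 3-byte boundaries,
   which is the case that appeared to hang the GeForce driver. */
static const GLubyte coloursRGB[] = {
    255,   0,   0,
      0, 255,   0,
      0,   0, 255
};

/* Workaround: pad to 4 bytes (RGBA) so every colour stays 4-byte aligned. */
static const GLubyte coloursRGBA[] = {
    255,   0,   0, 255,
      0, 255,   0, 255,
      0,   0, 255, 255
};

static const GLfloat verts[] = {
    -1.0f, -1.0f, 0.0f,
     1.0f, -1.0f, 0.0f,
     0.0f,  1.0f, 0.0f
};

static const GLuint indices[] = { 0, 1, 2 };

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_COLOR_ARRAY);

    glVertexPointer(3, GL_FLOAT, 0, verts);

    /* Problem case: 3 components, tightly packed (not 4-byte aligned). */
    /* glColorPointer(3, GL_UNSIGNED_BYTE, 0, coloursRGB); */

    /* Workaround: 4 components per vertex keeps 4-byte alignment. */
    glColorPointer(4, GL_UNSIGNED_BYTE, 0, coloursRGBA);

    glDrawElements(GL_TRIANGLES, 3, GL_UNSIGNED_INT, indices);

    glDisableClientState(GL_COLOR_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);

    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_DEPTH | GLUT_RGB);
    glutInitWindowSize(320, 240);
    glutCreateWindow("colour array alignment test");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}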
Now, as far as I know, everyone with a mainstream card apart from the GeForce has been able to run my app, so this looks specific to GeForce cards and is probably to do with the way they have implemented HW T&L. Just thought I should let you guys know.
As an aside, does anyone know if/how you can remove the DOS box in GLUT apps? Also, can GLUT apps be run fullscreen as opposed to maximised? Thanks.