GLSL - can't get shaders working

I know GLSL is not hardware-accelerated on my card yet, but I want to learn GLSL…

I have tried this tutorial:

Since my own implementation did not work, I tried this one instead:
-> I deleted the GLEW functions; the rest of the code is the same.

:eek: But it does not work either. Shouldn't the program automatically fall back to the software renderer? What should I do to get it working? I know there is an Apple example about this, but like all Apple examples it is far too huge… I have also tried OpenGL Profiler's option to choose the driver, with no results…

No compiler or linker problems, just no shader output. I am using Xcode with GLUT. I have also tried g++ blabla -framework OpenGL -framework GLUT…

Hardware (should not matter, since hardware acceleration is not available anyway, I suppose): GeForce FX 5200
Software: Mac OS X 10.4.0

:eek: :confused:

Ahhh, the renderer is specified in the pixel format, so I can't use GLUT anymore?

You can use OpenGL Profiler to force a particular pixel format, so you could use GLUT that way. If that's not acceptable (e.g. you want to give the program to somebody), then you'll need to use AGL or NSGL to be able to specify the renderer ID in the pixel format.
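For example, with AGL you can pin the renderer in the pixel format attribute list. This is only a sketch from memory of the 10.4 headers (verify the constant names against AGL/AGLRenderers.h); AGL_RENDERER_GENERIC_FLOAT_ID should be the 10.4 software renderer that supports GLSL:

```c
#include <AGL/agl.h>   /* Apple-only; renderer IDs live in <AGL/AGLRenderers.h> */

/* Sketch: choose a pixel format that pins the renderer. Constant names are
   from memory of the 10.4 SDK -- check the headers on your machine. */
static AGLContext create_software_glsl_context(void)
{
    GLint attribs[] = {
        AGL_RGBA,
        AGL_DOUBLEBUFFER,
        AGL_DEPTH_SIZE, 16,
        AGL_RENDERER_ID, AGL_RENDERER_GENERIC_FLOAT_ID,
        AGL_NO_RECOVERY,               /* don't fall back to another renderer */
        AGL_NONE
    };
    AGLPixelFormat fmt = aglChoosePixelFormat(NULL, 0, attribs);
    if (fmt == NULL)
        return NULL;                   /* no renderer matched the request */
    AGLContext ctx = aglCreateContext(fmt, NULL);
    aglDestroyPixelFormat(fmt);
    return ctx;
}
```

You would still need to attach the context to a window (aglSetDrawable / aglSetCurrentContext) before issuing GL calls.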

Remember to check for the presence of all four GLSL extensions before calling any GLSL-related functions.

Since I started using my shaders, the depth test does not work anymore…
Why is that? Depth test, alpha test and so on are done after the fragment shader, right?
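For reference, here is a minimal pass-through shader pair (GLSL 1.10, fixed-function style as on 10.4) that should leave depth testing intact. As long as the vertex shader writes gl_Position with the standard transform and the fragment shader does not touch gl_FragDepth, depth values come out the same as in fixed function:

```glsl
// --- vertex shader ---
void main(void)
{
    gl_Position = ftransform();   // invariant with the fixed-function transform
    gl_FrontColor = gl_Color;
}

// --- fragment shader ---
void main(void)
{
    gl_FragColor = gl_Color;      // note: writing gl_FragDepth here would
                                  // override the interpolated depth value
}
```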

Thanks, OneSadCookie

That certainly should work, and seems to for the GLSLShowpiece example (/Developer/Examples/OpenGL/Cocoa/GLSLShowpiece). Did you remember to request a depth buffer?
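With GLUT, requesting a depth buffer means passing GLUT_DEPTH when initializing the display mode. A sketch of the setup (window creation and callbacks omitted):

```c
#include <GLUT/glut.h>   /* <GL/glut.h> on non-Mac platforms */

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* Without GLUT_DEPTH there is no depth buffer at all, and
       glEnable(GL_DEPTH_TEST) silently has nothing to test against. */
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
    glutCreateWindow("GLSL test");

    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);
    /* ... register display/reshape callbacks, load shaders, glutMainLoop() ... */
    return 0;
}
```

Also remember to clear GL_DEPTH_BUFFER_BIT along with the color buffer each frame.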

Same pixel format as in GLSLShowpiece. Depth test enabled, depth func is GL_LEQUAL.

Everything is fine without the shader…

Same bug with the shaders from GLSLShowpiece -> seems to be a setup problem.

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.