I'm just trying to texture objects that have cubemap texture coordinates, using shaders.
Without shaders this works fine (all faces render correctly), but with shaders only 2 faces (+Z and -Z) are drawn correctly; the others come out badly, with many stray lines across the faces.
Can you show us your glBindTexture call for this too?
Also check your texture specification: I’ve found one driver in the past where cubemaps go NUTS unless you specify the faces in +x/-x/+y/-y/+z/-z order.
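For reference, a hedged sketch of that upload order; the six face targets are consecutive enum values, so the canonical order can be expressed as a loop (`faceData` and the 256x256 size here are placeholders, not taken from the poster's code):

```c
GLuint id;
glGenTextures(1, &id);
glBindTexture(GL_TEXTURE_CUBE_MAP, id);
/* +X, -X, +Y, -Y, +Z, -Z: the targets are consecutive enums */
for (int i = 0; i < 6; ++i)
    glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X + i, 0, GL_RGBA,
                 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, faceData[i]);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
/* clamping all three coordinates avoids seams at face edges */
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
```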
glfreak, it didn’t help (I’m not using OpenGL >= 3.0).
mhagain, here it is (it’s distilled from fairly complicated code):
glBindTexture (target, id);
where target is GL_TEXTURE_CUBE_MAP (I verified it). I’ve also kept the glEnable (GL_TEXTURE_CUBE_MAP); I tried removing it, but with no more luck.
Images are also loaded in the order you wrote.
From what I understand from your posts, my code seems right. Maybe, as is often the case, there’s some kind of bug in my code, but since everything else textures fine, and it works well without shaders, I highly doubt it.
The only other thing I can think of is the uniform. I send it as a simple integer. Is that enough?
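Sending a plain integer is the right idea: a sampler uniform is set with glUniform1i, and the value is the texture *unit* index, not the texture object id. A minimal sketch (the `program` variable and the "cubeMap" uniform name are assumptions, not from the poster's code):

```c
glUseProgram(program);
glActiveTexture(GL_TEXTURE0);              /* unit 0 */
glBindTexture(GL_TEXTURE_CUBE_MAP, id);
/* the uniform gets the unit index (0), NOT the texture id */
glUniform1i(glGetUniformLocation(program, "cubeMap"), 0);
```

A common mistake is passing `id` here instead of `0`; that can produce exactly the kind of partly-wrong sampling described above.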
It works well when shaders are disabled and I use the old vertex pointer functions.
It works badly (only 2 faces are good) when shaders are enabled and I use vertex attributes.
It does not work at all (a flat orange, the dominant color of the skybox image set) when shaders are enabled and I use the old vertex pointer functions.
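For comparison, a minimal shader pair of the kind being described, assuming the object-space position is reused as the cube sampling direction (the attribute and uniform names are made up, and this targets the pre-3.0 GLSL the poster is on):

```glsl
// Vertex shader
attribute vec3 position;      // assumed generic attribute name
varying vec3 texDir;
void main()
{
    texDir = position;        // object-space position doubles as cube direction
    gl_Position = gl_ModelViewProjectionMatrix * vec4(position, 1.0);
}

// Fragment shader
varying vec3 texDir;
uniform samplerCube cubeMap;  // assumed uniform name
void main()
{
    gl_FragColor = textureCube(cubeMap, texDir);
}
```

If the attribute-based path and the pointer-based path disagree like this, it is worth checking that the attribute location bound with glBindAttribLocation (or queried with glGetAttribLocation) matches the index used in glVertexAttribPointer/glEnableVertexAttribArray.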
yeah, the GLSL spec uses {s,t,p,q} rather than the {s,t,r,q} of the OpenGL spec, because “r” would collide with the “r” from {r,g,b,a}. And it doesn’t allow mixing and matching of the different indexing sets.
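A quick illustration of that rule, on a made-up vec4:

```glsl
vec4 tc  = vec4(0.1, 0.2, 0.3, 1.0);
vec3 dir = tc.stp;    // fine: all components from the {s,t,p,q} set
vec3 col = tc.rgb;    // fine: all components from the {r,g,b,a} set
// vec3 bad = tc.stb; // illegal: mixes the two sets, won't compile
```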