What should the fragment shader look like for SNORM textures? If I want to use GL_TEXTURE_2D_ARRAY as the target, which type of sampler variable should be used (plain, i, or u)?
I am getting unexpected behaviour with the following fragment shader:
#version 150

in vec3 ps_texCoord;
uniform sampler2DArray tk_diffuseMap;
out vec4 fragColor;

void main()
{
    fragColor = texture(tk_diffuseMap, ps_texCoord);
}
You have only three options for generic samplers: floating-point, signed integer, and unsigned integer. A sampler type is either unprefixed (floating-point), prefixed with i (signed integer), or prefixed with u (unsigned integer). Having never used SNORM internal formats to this day, I can only judge from the GLSL spec, which states that normalized integers are retrieved with a floating-point sampler.
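To illustrate, here is a sketch of the three sampler families for a 2D array target (the uniform names are made up; the point is that the prefix must match the texture's internal format class, and that normalized formats count as floating-point):

```glsl
uniform sampler2DArray  floatTex; // float and normalized formats (e.g. GL_RGBA8, GL_RGBA8_SNORM)
uniform isampler2DArray intTex;   // signed integer formats (e.g. GL_RGBA8I)
uniform usampler2DArray uintTex;  // unsigned integer formats (e.g. GL_RGBA8UI)
```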
What is it you expect your shader to put out? Are you aware how the texture() family of functions works, what the arguments mean, and what the expected return values are?
Reading the GLSL spec, I find that it only mentions unsigned normalized integers, which I don’t think was intended. Otherwise it would imply that SNORM textures aren’t supposed to be accessible from a shader, which would defeat the purpose of having such internal formats.
I am using GL_BYTE as the texture data type along with a “sampler2DArray” variable. I tried an “isampler” variable too but did not get the expected output.
I just want to know which data type and sampler type are correct for SNORM textures.
As I said, floating-point samplers should work fine. What result do you expect and what result do you get? GL_BYTE denotes a signed integral type just like GL_SHORT or GL_INT - so as far as GLSL is concerned, there shouldn’t be a difference if your internal format is one of the SNORM formats. Are your tex coords correct? Do you get any GL errors?
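For completeness, the C-side setup this implies might look like the following sketch (the texture name, dimensions, and pixel pointer are hypothetical; the key pairing is the GL_RGBA8_SNORM internal format with client data of type GL_BYTE):

```c
/* Sketch: allocating a signed-normalized 2D texture array.
 * width, height, layers and pixels are assumed to be defined elsewhere. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D_ARRAY, tex);
glTexImage3D(GL_TEXTURE_2D_ARRAY, 0,
             GL_RGBA8_SNORM,        /* signed-normalized internal format */
             width, height, layers,
             0,                     /* border: must be 0 */
             GL_RGBA, GL_BYTE,      /* client data: signed bytes */
             pixels);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
```

In the shader this texture would then be read through a plain (floating-point) sampler2DArray, as shown earlier in the thread.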
I checked for GL errors and also verified my tex coords. It works fine when I use GL_RGBA instead of SNORM textures, so the problem is with the shader only.
There is a difference in the intensity of the colour values when I use SNORM textures. With GL_BYTE, it applies the texture when the value is 127, and if it’s < 127 it renders black.
Sorry, it was a driver bug. It works perfectly with GL_BYTE and a plain “sampler”.