I’m trying to do a YUV to RGB conversion in my fragment shader. It looks something like this:
uniform sampler2D texY, texU, texV;

void main()
{
    float nx = gl_TexCoord[0].x;
    float ny = gl_TexCoord[0].y;

    float y = texture2D(texY, vec2(nx, ny)).r;
    float u = texture2D(texU, vec2(nx / 2.0, ny / 2.0)).r;
    float v = texture2D(texV, vec2(nx / 2.0, ny / 2.0)).r;

    y = 1.1643 * (y - 0.0625);
    u = u - 0.5;
    v = v - 0.5;

    float r = y + 1.5958 * v;
    float g = y - 0.39173 * u - 0.81290 * v;
    float b = y + 2.017 * u;

    gl_FragColor = vec4(r, g, b, 1.0);
}
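To sanity-check the constants off the GPU, here is a CPU-side sketch of the same BT.601 "video range" conversion (the function name `yuv_to_rgb` is mine, not part of my actual code):

```c
#include <math.h>

/* Reference BT.601 video-range YUV -> RGB, mirroring the shader math.
   Inputs are normalized to [0,1], as texture2D(...).r would return.
   Sketch for checking the constants only; not the real render path. */
static void yuv_to_rgb(float y, float u, float v,
                       float *r, float *g, float *b)
{
    y = 1.1643f * (y - 0.0625f);   /* expand 16..235 luma to 0..1 */
    u = u - 0.5f;                  /* center chroma around zero   */
    v = v - 0.5f;

    *r = y + 1.5958f  * v;
    *g = y - 0.39173f * u - 0.81290f * v;
    *b = y + 2.017f   * u;
}
```

With neutral chroma (u = v = 0.5) this collapses to a gray value, which is an easy spot check: y = 0.0625 should give black, and any y should give r = g = b.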
I’m passing the 3 textures in as luminance (and, as you can see in the fragment shader, I’m only reading the r component). The 3 textures are uploaded like this:
GLint sampler2D1 = glGetUniformLocation(mShaderProgram, "texY");
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, bufY);
The same goes for U and V, but with width/2 and height/2 as the dimensions (for both U and V).
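For reference, with planar 4:2:0 data each chroma plane carries a quarter of the samples of the luma plane. A quick sketch of the buffer arithmetic I’m relying on (helper names are mine):

```c
#include <stddef.h>

/* Plane sizes for a planar YUV 4:2:0 frame.
   Y is full resolution; U and V are each (width/2) x (height/2).
   Sketch only; assumes even width and height. */
static size_t y_plane_size(int width, int height)
{
    return (size_t)width * (size_t)height;
}

static size_t chroma_plane_size(int width, int height)
{
    return (size_t)(width / 2) * (size_t)(height / 2);
}
```

For the 240x176 frame here that is 42240 bytes of Y and 10560 bytes each of U and V, so in a contiguous I420 buffer `bufU` and `bufV` would start at offsets 42240 and 52800.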
To my knowledge everything should work fine. When I display just the luminance, I see my texture the way it’s meant to look, but when I display my U and V, they look distorted.
Am I right to assume that with OpenGL 2.0 I can use non-power-of-two textures? The texture in question is 240x176. Will gl_TexCoord behave the way it’s meant to for a 240x176 texture?
(I’m assuming yes, because as I said my luminance works fine. I just want to confirm before I carry on looking into what’s gone wrong.)