Hello all,
I have three PBOs for the Y, U, and V planes from a video decoder. I'm converting them to RGB in a fragment shader and using the result as my frag color.
I’m using this:
#define PBO_BUFFER_OFFSET(i) ((char *)NULL + (i))
in order to pass the tex data to glTexSubImage2D.
It turns out that although this works when i = 0, it throws GL_INVALID_OPERATION (1282) for i = 1 and i = 2.
The only difference I've seen between my code and other code on the web that uses PBOs is that mine uses LUMINANCE textures with GL_UNSIGNED_BYTE (normally people have RGB textures with GL_FLOAT).
I'm guessing that could mess up the offsetting(?)
Anyway, here's the code (for the Y data; it is identical for the U and V data, with the PBO_BUFFER_OFFSET argument being 1 and 2 respectively. Obviously the U and V data are half the width and height, but that doesn't really matter here).
// Y on tex unit 0
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, mTexID_YUV[0]);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

// We won't have new data from the decoder every tick; just bind the old texture if we don't have new data.
if (mHasNewTexData)
{
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, mBufID_YUV[0]);
    glBufferData(GL_PIXEL_UNPACK_BUFFER, width * height * sizeof(unsigned char), 0, GL_STREAM_DRAW);
    void* memChunkY = glMapBuffer(GL_PIXEL_UNPACK_BUFFER, GL_WRITE_ONLY);
    if (memChunkY != 0)
    {
        memcpy(memChunkY, decoder_data.y, width * height * sizeof(unsigned char));
    }
    glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height, GL_LUMINANCE, GL_UNSIGNED_BYTE, PBO_BUFFER_OFFSET(0));
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
}

int sampler2D1 = glGetUniformLocation(mShaderProgram, "texY");
glUniform1i(sampler2D1, 0);
Any ideas what could be wrong?
cheers,
g.