same drawing in 1080p and then 720p and conversely


In a timed sequence, I first draw some things into a given glViewport, with given MODELVIEW and PROJECTION matrices.
Then I read this same drawing back into a buffer with glReadPixels.

But I need it at another size, i.e. I want to see the same elements (text at the same position, etc.) drawn first at 1280x720 and then at 1920x1080 (or conversely, or at a custom size entered by the user).

What should I do before calling glReadPixels?
I tried glScalef, or just changing the glViewport,
but what I want is the exact same image at a different resolution after it has first been drawn.

Hoping my question is not too confusing.
Thanks to all in advance.

You can redraw the scene: call glViewport(0, 0, 1080, 500), set up your projection matrix accordingly, and render your scene again.

Or, you can rescale the image you obtained with glReadPixels. You can use gluScaleImage, or glhScaleImage from my own library, which is faster.
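If neither is available to you, a nearest-neighbour rescale of a tightly packed 4-byte-per-pixel buffer is only a few lines of plain C. This is just a rough sketch (the function name is made up, and unlike gluScaleImage it does no filtering, so downscales will look blockier):

```c
#include <string.h>

/* Nearest-neighbour rescale of a tightly packed 4-byte-per-pixel image.
   A rough stand-in for gluScaleImage; no filtering is done. */
static void scale_image_nearest(const unsigned char *src, int sw, int sh,
                                unsigned char *dst, int dw, int dh)
{
    for (int y = 0; y < dh; ++y) {
        int sy = y * sh / dh;          /* nearest source row */
        for (int x = 0; x < dw; ++x) {
            int sx = x * sw / dw;      /* nearest source column */
            memcpy(&dst[(y * dw + x) * 4], &src[(sy * sw + sx) * 4], 4);
        }
    }
}
```

It works on the raw bytes, so the channel order (RGBA or BGRA) doesn't matter.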

Ok thanks. So I tried gluScaleImage, but it is too slow (and I can't manage to include the "glh" lib in my project to use glhScaleImage).

Meanwhile, I've read this advice here:
"If this is just about displaying images with various sizes, why not create a textured quad and scale it to the requested size? The texture could always be the same size then."

And that is what I try to do with the following code.
The result is that I can see "BACK" written in red, but I don't see "before glFlush", and the background has a strange color (sometimes purple, sometimes green; for the green issue I suspect a wrong initialization of the graphics card).
=> Can anyone tell me what is wrong in my texturing of the buffer obtained with glReadPixels?
=> Or is there another solution?
(I would prefer a fix for this code rather than another solution (framebuffer, …).)

// ---
myPrintingFunction(200, 200, "before glFlush");

// Unload Bytes
if (mBuffer != NULL)
delete[] mBuffer;
mBuffer = NULL;

// Load Bytes from current window
mBuffer = new GLubyte[ClientWidth * ClientHeight * 4];
glReadPixels(0, 0, ClientWidth, ClientHeight, GL_BGRA, GL_UNSIGNED_BYTE, mBuffer);


// Unload TextureId
if (mTextureId != 0)
glDeleteTextures(1, &mTextureId);
mTextureId = 0;

// Load TextureId
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glGenTextures(1, &mTextureId);
glBindTexture(GL_TEXTURE_2D, mTextureId);
glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA, ClientWidth, ClientHeight, 0, GL_BGRA, GL_UNSIGNED_BYTE, mBuffer);

// ---
glViewport(-mBufferWidth, -mBufferHeight, mBufferWidth * 2, mBufferHeight * 2);

glOrtho(double(-mBufferWidth), double(mBufferWidth), double(-mBufferHeight), double(mBufferHeight), 1000.0, 100000.0);


glTranslatef(0.0, 0.0, -50000.0);



glBindTexture(GL_TEXTURE_2D, mTextureId);
mxIncrust0 = 0.0;
myIncrust0 = 0.0;
mxIncrust1 = mBufferWidth;
myIncrust1 = mBufferHeight;
glColor4ub((unsigned char)255,(unsigned char)255,(unsigned char)255,(unsigned char)255);
glTexCoord2d(0.0, 0.0);
glVertex2d(mxIncrust0, myIncrust0);
glTexCoord2d(0.0, 1.0);
glVertex2d(mxIncrust0, myIncrust1);
glVertex2d(mxIncrust1, myIncrust1);
glVertex2d(mxIncrust1, myIncrust0);




if (mBufferPlayout != NULL)
delete[] mBufferPlayout;
mBufferPlayout = NULL;
mBufferPlayout = new GLubyte [mBufferWidth * mBufferHeight * 4];

glReadPixels(0, 0, mBufferWidth, mBufferHeight, GL_BGRA, GL_UNSIGNED_BYTE, mBufferPlayout);

// then I pass this mBufferPlayout to a graphic card (capture card in output in fact)

Thanks for any help, cheers, Arnaud.

I didn’t see any code on that page.

Also, this line is invalid:
glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA, ClientWidth, ClientHeight, 0, GL_BGRA, GL_UNSIGNED_BYTE, mBuffer);

The first GL_BGRA is not a valid internal format. It should be GL_RGBA8.

Also, I suggest never using GL_CLAMP. Use GL_CLAMP_TO_EDGE.
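One more thing to check when you hand the pixels to your capture card: glReadPixels returns rows bottom-up, and most video hardware and image formats expect them top-down, so you may need to flip the buffer vertically. A minimal sketch, assuming tightly packed 4-byte-per-pixel data:

```c
#include <stdlib.h>
#include <string.h>

/* Flip a tightly packed 4-byte-per-pixel image vertically,
   swapping row y with row (height - 1 - y). */
static void flip_rows(unsigned char *pixels, int width, int height)
{
    size_t row = (size_t)width * 4;
    unsigned char *tmp = malloc(row);
    if (tmp == NULL)
        return;
    for (int y = 0; y < height / 2; ++y) {
        unsigned char *top    = pixels + (size_t)y * row;
        unsigned char *bottom = pixels + (size_t)(height - 1 - y) * row;
        memcpy(tmp, top, row);
        memcpy(top, bottom, row);
        memcpy(bottom, tmp, row);
    }
    free(tmp);
}
```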

Thank you.

I had found the problem too and replaced the non-existent GL_BGRA internal format with GL_RGBA, and now my texture is properly bound to the quad (but the greenish tint issue remains; that must be another problem: initialization of my frame with YUV?).
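To rule out a simple red/blue channel swap as part of the tint problem, I can swizzle the buffer in place before handing it on. A quick sketch, assuming tightly packed 4-byte pixels (a swap alone would explain a blue/orange shift rather than a green one, but it's cheap to test):

```c
#include <stddef.h>

/* Swap the first and third channels of every pixel in a tightly
   packed 4-byte-per-pixel buffer, converting BGRA <-> RGBA in place. */
static void swap_red_blue(unsigned char *pixels, int width, int height)
{
    size_t count = (size_t)width * height * 4;
    for (size_t i = 0; i < count; i += 4) {
        unsigned char t = pixels[i];
        pixels[i]       = pixels[i + 2];
        pixels[i + 2]   = t;
    }
}
```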

I tried GL_RGBA8 but it gives me "error 1280" when I call glGetError() right after glTexImage2D.

Thanks for the "GL_CLAMP_TO_EDGE" tip: the other parts of my application were already using "GL_CLAMP_TO_EDGE", so I replaced "GL_CLAMP" with "GL_CLAMP_TO_EDGE" here, as you advise and as explained here:
Texture edge color problem

If you want to clamp your texture fetches, use GL_CLAMP_TO_EDGE, not GL_CLAMP. GL_CLAMP_TO_EDGE means that the colors outside of the texture range are the color of the nearest texel in the texture. Whereas GL_CLAMP means that the colors outside of the texture range are the border color. This is usually not what you want, and can lead to black borders around your texture (since the border color is black).

I would not know why it would cause a 1280, whatever that is.
GL_RGBA8 has probably been supported since GL 1.1 or 1.2.

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

or the GL_BGRA version if your source data has its red and blue channels swapped:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_BGRA, GL_UNSIGNED_BYTE, pixels);

The internal format always stays GL_RGBA8. The Wiki had an article on internal formats.

Correct, exactly, thanks again.

I also found the OpenGL wiki article concerning this format choice,
and chose this:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_BGRA, GL_UNSIGNED_BYTE, pixels);

I will now make a new post for my green tinge issue.

Thanks again.