Fast image scaling

Hello there,

I'm new to OpenGL and I have the following problem. I get images from a camera very fast, up to 25 frames per second. To display the images I use the Qt toolkit, specifically a QGLWidget.

My problem is that the images I get from the camera have a resolution of 1600 x 1200. My widget can be resized dynamically, and I have to scale the image to fit it.

To draw the image bits to the widget I use the following call:

glDrawPixels(m_image.width(), m_image.height(), GL_RGB, GL_UNSIGNED_BYTE, bits);


But the image isn't scaled. I then found a function to scale images.


That function is so slow that my CPU usage goes up to 66%. So my question is: how can I scale the image to the widget quickly and in a resource-friendly way? Is there an OpenGL function that could help me with this?

If possible, please show me a little example; at the moment it is not easy for me to find the right GL functions.

Thank you very much for your help.

Best regards


PS: Sorry for my bad english.

Can you bind the image you have as a texture and then just apply it to a screen-aligned quad at the resolution you require?

Or, I'm not sure if this is exactly helpful, but one option might be to render as you currently do into a Frame Buffer Object, then render a second pass at the resolution you require, using the result of the first pass as a texture applied to a screen-aligned quad.

Hello James,

thank you for your reply. It's not easy for me to follow your hint, because I'm really new to OpenGL. Could you give me a little example code?

Then it would be a bit easier to understand what you mean.

Best regards


What he means is that instead of copying your image directly to the framebuffer with glDrawPixels, you should copy it into the GPU's texture memory by calling glTexImage2D, and then simply draw a quad covered with this texture - that part will run fast.

First, read some tutorials on texturing and get it working.
Then optimize it:

  1. use glTexImage2D only once, to create an empty 1600x1200 texture during initialization
  2. use glTexSubImage2D to upload new frame data into this texture
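For reference, the two steps above might look roughly like this (a sketch only; it assumes a current GL context, and names like `initTexture`, `uploadFrame` and `m_texture` are made up for illustration):

```cpp
// Sketch: create the texture once, then only update its contents.
// Assumes a current GL context (e.g. QGLWidget::initializeGL/paintGL).

GLuint m_texture;  // illustrative member name

void initTexture()
{
    glGenTextures(1, &m_texture);
    glBindTexture(GL_TEXTURE_2D, m_texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // Allocate storage once; passing NULL leaves the contents undefined.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 1600, 1200, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, NULL);
}

void uploadFrame(const unsigned char* bits)
{
    glBindTexture(GL_TEXTURE_2D, m_texture);
    // Replace the pixel data without reallocating the texture.
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 1600, 1200,
                    GL_RGB, GL_UNSIGNED_BYTE, bits);
}
```

Note that 1600x1200 is not a power-of-two size, so this relies on the driver supporting non-power-of-two textures.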

A further optimization (very simple, but requires shaders):

  1. use GPU-friendly formats when uploading the texture to the GPU, like GL_BGR or GL_BGRA.
  2. if the images captured from the camera are not in these GPU-friendly formats, still send them to the GPU in a friendly format, but then use a fragment shader to swap the red and blue components on the fly.

Final optimization (it doesn't increase upload speed, but allows more parallelism between the GPU and CPU if you have use for it):

  1. use the pixel buffer object extension to upload images instead of plain glTexSubImage2D
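The PBO path might be sketched like this (assuming GL_ARB_pixel_buffer_object is available; all names, sizes and the GL_STREAM_DRAW usage hint are illustrative):

```cpp
// Sketch: upload camera frames through a pixel buffer object so the
// transfer to the texture can overlap with CPU work.
GLuint pbo;                                 // illustrative name
const int IMG_BYTES = 1600 * 1200 * 3;      // RGB frame size

void initPBO()
{
    glGenBuffers(1, &pbo);
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo);
    glBufferData(GL_PIXEL_UNPACK_BUFFER, IMG_BYTES, NULL, GL_STREAM_DRAW);
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
}

void uploadFrameViaPBO(GLuint texture, const unsigned char* bits)
{
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo);
    // Copy the camera frame into the buffer object.
    glBufferData(GL_PIXEL_UNPACK_BUFFER, IMG_BYTES, bits, GL_STREAM_DRAW);
    glBindTexture(GL_TEXTURE_2D, texture);
    // With an unpack PBO bound, the last argument is a byte offset
    // into the buffer, not a client-memory pointer.
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 1600, 1200,
                    GL_RGB, GL_UNSIGNED_BYTE, 0);
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
}
```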

Thank you for your help. What I have done now is this:

	QMutexLocker lock(&m_mutex);

	if ( !newBit ) {
		GLuint texture;
		// allocate a texture name
		glGenTextures( 1, &texture );

		// select our current texture
		glBindTexture( GL_TEXTURE_2D, texture );
		glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, m_image.width(), m_image.height(),0,GL_LUMINANCE,	GL_UNSIGNED_BYTE, m_image.bits());
		//glDrawPixels(m_image.width(), m_image.height(), GL_LUMINANCE, GL_UNSIGNED_BYTE, m_image.bits());

	glBegin( GL_QUADS );
	glTexCoord2d(0.0,0.0); glVertex2d(0.0,0.0);
	glTexCoord2d(1.0,0.0); glVertex2d(1.0,0.0);
	glTexCoord2d(1.0,1.0); glVertex2d(1.0,1.0);
	glTexCoord2d(0.0,1.0); glVertex2d(0.0,1.0);

	if (!newBit)
	glDeleteTextures( 1, &texture );

I guessed that this would draw my greyscale picture, but it seems I have forgotten some things.

“My problem is that the images I get from the camera have a resolution of 1600 x 1200. My widget can be resized dynamically, and I have to scale the image to fit it.”
By “camera”, do you mean a camera in your scene?
A fast way to do that while avoiding frame buffer objects (good for beginners) is to copy your backbuffer (your rendered scene) directly into a texture using glCopyTexImage2D. This is fast since it is done from VRAM to VRAM by the GPU. After that you have a texture containing your scene, and you can draw it again on a fullscreen quad (or a lower-resolution quad) with proper texture filtering.

Do you have an example of how I can do that?


// Capture backbuffer into <screenTexture> id

// Switch to ortho! and draw quad with desired size

Look up what these functions do to understand the logic.
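One possible way to fill in those two commented steps (a sketch only; `screenTexture` must be a texture id created beforehand, and `winW`/`winH` are the window size):

```cpp
// Sketch: grab the backbuffer into a texture, then redraw it as a quad.
// Assumes a current GL context and an already-generated screenTexture id.
void captureAndRedraw(GLuint screenTexture, int winW, int winH)
{
    // Capture backbuffer into screenTexture (VRAM-to-VRAM copy).
    // On old hardware the copied size may need to be a power of two.
    glBindTexture(GL_TEXTURE_2D, screenTexture);
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, winW, winH, 0);

    // Switch to ortho and draw a quad with the desired size.
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, winW, 0, winH, -1, 1);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glEnable(GL_TEXTURE_2D);
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(0,    0);
    glTexCoord2f(1, 0); glVertex2f(winW, 0);
    glTexCoord2f(1, 1); glVertex2f(winW, winH);
    glTexCoord2f(0, 1); glVertex2f(0,    winH);
    glEnd();
    glDisable(GL_TEXTURE_2D);
}
```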

1600x1200 may suggest a rendered scene, but from the description in the first post I think he meant a real camera.

As for your code, Treehouse - during initialization, call:


  1. glGenTextures
  2. glBindTexture
  3. glTexImage2D
  4. glTexParameteri - set GL_TEXTURE_MIN_FILTER to either GL_NEAREST or GL_LINEAR.

In main loop, use:

  1. glBindTexture
  2. glTexSubImage2D
  3. glEnable(GL_TEXTURE_2D)
  4. …render…
  5. glDisable(GL_TEXTURE_2D)

Look at any texturing tutorial - you will find exactly these steps in the code (except for glTexSubImage2D, which you use to update the texture dynamically).
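Putting both lists together, a paintGL along these lines is one possibility (a sketch; member names such as `m_image`, `m_mutex` and `texture` follow Treehouse's earlier code, and it assumes the texture was created and its filters set once in initializeGL):

```cpp
// Sketch: per-frame path assuming the texture already exists
// (glGenTextures + glTexImage2D + glTexParameteri done at init).
void CFrame::paintGL()
{
    QMutexLocker lock(&m_mutex);

    glClear(GL_COLOR_BUFFER_BIT);

    glBindTexture(GL_TEXTURE_2D, texture);
    // Upload the new frame into the existing texture - note that this
    // uses the image size, not the widget size.
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                    m_image.width(), m_image.height(),
                    GL_LUMINANCE, GL_UNSIGNED_BYTE, m_image.bits());

    glEnable(GL_TEXTURE_2D);
    glBegin(GL_QUADS);
    // Texture coordinates stay in [0,1] so the whole image appears once;
    // the quad fills the default clip space, so resizing the widget
    // rescales the image automatically.
    glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
    glEnd();
    glDisable(GL_TEXTURE_2D);
}
```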

OK. So the function that should show the image looks like this:

void CFrame::paintGL(void)
	QMutexLocker lock(&m_mutex);
	if ( !newBit ) {
		glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width(), height(),0,GL_LUMINANCE,	GL_UNSIGNED_BYTE, bits);

		glBindTexture(GL_TEXTURE_2D, texture);                    // Select Texture
		glBegin(GL_QUADS);                                        // Start Drawing A Textured Quad
		glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.1f, -1.1f, 0.0f); // Bottom Left
		glTexCoord2f(3.0f, 0.0f); glVertex3f( 1.1f, -1.1f, 0.0f); // Bottom Right
		glTexCoord2f(3.0f, 3.0f); glVertex3f( 1.1f, 1.1f, 0.0f);  // Top Right
		glTexCoord2f(0.0f, 3.0f); glVertex3f(-1.1f, 1.1f, 0.0f);  // Top Left
		glEnd();                                                          // Done Drawing The Quad

But the only thing I see is a plain white picture. What did I do wrong?

One thing you did wrong is that you didn't pay attention to what I wrote in my last post :wink: