In a real-time image system, scaling an image costs a lot of CPU cycles, because every image frame needs a resample pass.
So my question is:
Can the image scaling be done by the GPU? If so, please give a little guidance on how to do it with OpenGL.
By "scale" I mean resizing a picture or image from its original size to another size without obvious loss of information. For example, when we adjust the window size of a replayer / Windows Media Player, an image scale is performed.
It all depends on what you want to do with it.
If all you’re interested in is displaying a (relatively static) picture, you only need to render one textured quad:
Upload texture, specify minification/magnification options (texture filtering) - you’ll probably want to use GL_LINEAR.
When refreshing the screen, render a textured quad.
If you want to retrieve the results, you can use glReadPixels, but it’s a slow operation.
If you want to get more advanced filtering (more advanced than simple linear interpolation), you might be able to use a fragment program for it, but it’s more complex to get that to work.
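The basic approach above can be sketched as a small GLUT program. This is a minimal sketch, not a drop-in solution: it assumes GLUT is available, and the 64×64 checkerboard is a placeholder for your real image data. The quad covers the whole viewport, so the GPU resamples the texture (with GL_LINEAR, i.e. bilinear filtering) to whatever size the window has.

```c
/* Sketch: upload one RGB image as a texture and let the GPU scale it
   to the window size via a single textured quad with GL_LINEAR
   filtering. Assumes GLUT; the checkerboard stands in for real data. */
#include <GL/glut.h>

#define IMG_W 64
#define IMG_H 64
static unsigned char pixels[IMG_H][IMG_W][3];

static void init(void)
{
    GLuint tex;
    int x, y;

    /* Placeholder checkerboard; replace with your frame data. */
    for (y = 0; y < IMG_H; ++y)
        for (x = 0; x < IMG_W; ++x) {
            unsigned char c = ((x / 8 + y / 8) % 2) ? 255 : 0;
            pixels[y][x][0] = pixels[y][x][1] = pixels[y][x][2] = c;
        }

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    /* GL_LINEAR = bilinear interpolation when scaling up or down. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, IMG_W, IMG_H, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
    glEnable(GL_TEXTURE_2D);
}

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    /* One quad covering the viewport; the GPU does the resampling. */
    glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex2f(-1, -1);
        glTexCoord2f(1, 0); glVertex2f( 1, -1);
        glTexCoord2f(1, 1); glVertex2f( 1,  1);
        glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(640, 480);
    glutCreateWindow("GPU image scaling");
    init();
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```

Note that this program needs a display and a GL context to run, so it can't be exercised headlessly.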
In fact, I want to scale a real-time video image (25 frames/second), so I need a very fast scale operation. Could a textured quad work for that?
In principle: yes. One textured quad won’t be a problem at all.
But it all depends on how fast textures can be read in and uploaded to the card.
You’ll probably not want to create mipmaps, since those cost performance to create.
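For the per-frame upload, one common pattern (a sketch under the assumption that the texture was already allocated once with glTexImage2D, as in the quad example) is to update it in place with glTexSubImage2D rather than re-creating it every frame, since reusing the existing texture storage is usually cheaper. The function name `upload_frame` and the 1024×512 RGB frame size are illustrative, not from any library.

```c
#include <GL/gl.h>

/* Sketch of a per-frame video update path. Assumes `tex` was
   allocated once with glTexImage2D (1024x512, GL_RGB) and `frame`
   points to the new frame's pixel data. glTexSubImage2D overwrites
   the existing texture storage instead of reallocating it, which
   is usually the faster path for streaming video. */
void upload_frame(GLuint tex, const unsigned char *frame)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0,   /* target, mip level */
                    0, 0,               /* x/y offset        */
                    1024, 512,          /* width, height     */
                    GL_RGB, GL_UNSIGNED_BYTE, frame);
    /* ...then redraw the textured quad and swap buffers. */
}
```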
I have tried it: creating and reading a 1024×512 2D texture costs 5 ms on my PC.
But I don't know how to scale the 2D texture and display it on the screen.
I hope for your guidance.
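One hedged pointer on that last question: once the frame is in a texture, the "scale" step is nothing more than drawing the textured quad over a viewport of the destination size. With GLUT, a reshape callback keeps the viewport matched to the window, so the same quad is resampled by the GPU at whatever size the user drags the window to.

```c
#include <GL/glut.h>

/* Sketch only: keep the viewport matched to the window size. A quad
   drawn with corners at (+/-1, +/-1) then fills the whole w x h
   window, and the GPU resamples the texture to that size — no
   CPU-side rescaling is needed. */
static void reshape(int w, int h)
{
    glViewport(0, 0, w, h);
}

/* Registered in main() with: glutReshapeFunc(reshape); */
```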