How to efficiently resize a framebuffer image?

Hi.
I need to efficiently stream framebuffer images out to the network.
The size (w×h) and pixel format of the streamed image can vary according to the request coming from the network.
I think the best way to do this is to prepare the output image on the graphics card, then read it back to host RAM and send it straight to the network.
I'm using a framebuffer object (FBO) to render into a texture.

So I'm thinking about it this way:

  1. Create an output texture that will hold the final image for streaming, with the requested width×height and pixel format.
  2. Set glPixelZoom to rescale the image from the framebuffer size to the output size.
  3. Use glCopyTexImage2D (or glCopyPixels) to transfer the framebuffer into the output texture.
  4. Read the output texture back to host memory with glGetTexImage, as sketched below.
    (Maybe use a PBO to allow asynchronous readout of the output image.)
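Here's a minimal sketch of what I have in mind for step 4 with a PBO, assuming an RGBA8 output texture (out_tex, out_w, out_h are placeholder names):

```c
GLuint pbo;
glGenBuffers(1, &pbo);
glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
/* GL_STREAM_READ: written by GL, read back once by the application */
glBufferData(GL_PIXEL_PACK_BUFFER, out_w * out_h * 4, NULL, GL_STREAM_READ);

glBindTexture(GL_TEXTURE_2D, out_tex);
/* With a PACK buffer bound, the last argument is a byte offset into the
 * PBO, so this call can return before the GPU has finished the copy. */
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, (void *)0);

/* ... do other work while the transfer completes ... */

/* Mapping is where the CPU waits if the transfer is still in flight. */
void *pixels = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY);
if (pixels) {
    /* hand pixels to the network code here */
    glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
}
glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
```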

Is there any other way to do this task?

glPixelZoom: done in software.
Use glBlitFramebuffer or shaders to scale down, and PBOs to transfer to RAM. glGenerateMipmap can be a quick and blocky way to downscale from an arbitrary size to a set of mipmap sizes.
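For the record, a rough sketch of the blit + PBO path, assuming dst_fbo has an RGBA8 color attachment of the target size and pbo is allocated as in the sketch above (src_fbo, dst_fbo, src_w/src_h, out_w/out_h are placeholder names):

```c
/* GPU-side downscale: blit from the render FBO into a smaller FBO. */
glBindFramebuffer(GL_READ_FRAMEBUFFER, src_fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, dst_fbo);
glBlitFramebuffer(0, 0, src_w, src_h,   /* source rectangle      */
                  0, 0, out_w, out_h,   /* destination rectangle */
                  GL_COLOR_BUFFER_BIT, GL_LINEAR);

/* Start an asynchronous readback of the scaled image into the PBO. */
glBindFramebuffer(GL_READ_FRAMEBUFFER, dst_fbo);
glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
glReadPixels(0, 0, out_w, out_h, GL_RGBA, GL_UNSIGNED_BYTE, (void *)0);

/* Map a frame or two later to avoid stalling the pipeline. */
void *pixels = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY);
if (pixels) {
    /* send pixels to the network */
    glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
}
glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
```

Note that a single GL_LINEAR blit only samples a 2×2 footprint, so large scale factors will alias; that's where a shader pass or glGenerateMipmap does better.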