FrameBuffer to texture

I know that on SGI hardware you can render the framebuffer directly into a texture using a pbuffer. This allows you to bypass the slow pixel copy that travels across the AGP bus into processor memory and then back to a texture.

Has anyone read of a way to do this with OpenGL, an OpenGL extension, or an NVIDIA extension?

The reason I want to map the framebuffer to a texture is so I can do reflections (or shadows) by rendering an object from a different viewpoint and then mapping that to a texture. Having to go out to processor memory will just kill performance. Any help would be great. Thanks!!!

AFAIK all GLX implementations (i.e. on Linux) have to provide a pbuffer. Could be wrong, though.

NVIDIA have started optimising their drivers (this is from a GeForce FAQ):

"Writing to the depth buffer via glDrawPixels is quite slow (though reading the depth buffer via glReadPixels is moderately fast)."


Tried glCopyTexImage2D?

“The glCopyTexImage2D function copies pixels from the frame buffer into a two-dimensional texture image.”

Thanks Relic that is EXACTLY what I was looking for.

Sorta sad that it's a standard GL call and I didn't remember it. Thanks a ton.
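
For anyone finding this thread later, the render-to-texture loop with glCopyTexImage2D looks roughly like the sketch below. It assumes a GL context already exists (GLUT, GLX, whatever), and that the texture size is a power of two no larger than the window; `draw_scene_from_reflected_viewpoint()`, `window_width`, and `window_height` are placeholders for your own code, not real API calls.

```c
#include <GL/gl.h>

#define TEX_SIZE 256                 /* must be <= window size, power of two */

static GLuint reflection_tex;

void init_reflection_texture(void)
{
    glGenTextures(1, &reflection_tex);
    glBindTexture(GL_TEXTURE_2D, reflection_tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}

void update_reflection_texture(void)
{
    /* 1. Render the scene from the mirrored (or light) viewpoint into
          the lower-left TEX_SIZE x TEX_SIZE corner of the framebuffer. */
    glViewport(0, 0, TEX_SIZE, TEX_SIZE);
    draw_scene_from_reflected_viewpoint();   /* your rendering code */

    /* 2. Copy that region of the framebuffer into the bound texture.
          On hardware that accelerates this path, the pixels never
          cross the AGP bus into system memory. */
    glBindTexture(GL_TEXTURE_2D, reflection_tex);
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB,
                     0, 0, TEX_SIZE, TEX_SIZE, 0);

    /* 3. Clear and render the normal view, texturing with
          reflection_tex where the reflective surface appears. */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glViewport(0, 0, window_width, window_height);
}
```

Once the texture already exists, glCopyTexSubImage2D is usually the faster choice for per-frame updates, since it replaces the pixels without reallocating the texture image.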