Using textures as render targets

The first line of the OpenGL wiki page on Renderbuffer Objects states:

Renderbuffer Objects are OpenGL Objects that contain images. They are created and used specifically with Framebuffer Objects. They are optimized for being used as render targets, while Textures may not be.

When doing deferred shading with framebuffer objects, it is common to set up a texture target with glFramebufferTexture2D() instead of using a renderbuffer (glFramebufferRenderbuffer()). Is this not optimal? If so, is there a better way to produce the data needed for the deferred stage?

The main difference between the two is that you use a texture when you want to sample the buffer later, while a renderbuffer is for data you don't need to keep. For instance, I normally use a renderbuffer for the depth buffer and a texture for the color buffer when that's all I need.
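A minimal sketch of the setup described above: a texture for the color attachment (so it can be sampled in a later pass) and a renderbuffer for depth (write-only). This assumes a current GL 3.x context with function pointers already loaded; the width and height values are placeholders.

```c
GLuint fbo, colorTex, depthRb;
const GLsizei width = 1280, height = 720; /* placeholder dimensions */

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

/* Color attachment: a texture, so the result can be bound and
 * sampled in a later pass. */
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);

/* Depth attachment: a renderbuffer, since we never sample it. */
glGenRenderbuffers(1, &depthRb);
glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24,
                      width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depthRb);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    /* handle incomplete framebuffer */
}
```

For a deferred G-buffer you would attach several such textures (GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1, ...) instead of just one.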

I think the thing confusing you is the line "Textures may not be."
The word "may" is important here: I am fairly sure the two are essentially the same in most cases, though someone more familiar with the internal workings of OpenGL drivers could answer this in more detail.

In the case of deferred rendering, though, you need to sample every buffer in the lighting pass, so textures are used for all render targets.

You can also read back from renderbuffer objects with glReadPixels if you need to do that.
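A short sketch of that read-back, assuming a framebuffer object `fbo` with a renderbuffer (or texture) bound to its first color attachment, and the same placeholder dimensions as above:

```c
/* Read the color attachment of an FBO into client memory.
 * Requires a current GL context; fbo, width, height are assumed
 * to exist from the earlier setup. */
const GLsizei width = 1280, height = 720; /* placeholder dimensions */
unsigned char *pixels = malloc((size_t)width * height * 4);

glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
glReadBuffer(GL_COLOR_ATTACHMENT0);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

/* ... use pixels ... */
free(pixels);
```

Note that glReadPixels stalls the pipeline; for frequent read-backs a pixel buffer object is usually preferred.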

From what I understand, renderbuffers and textures can have different memory layouts: renderbuffers are optimized for display/rendering speed, while textures are optimized for use as shader input, e.g. for the texture cache.

According to the OpenGL documentation, glFramebufferTexture2D is available from OpenGL 3.2. How do you do deferred shading if you can't use glFramebufferTexture2D?

Would only glFramebufferRenderbuffer be available, so you couldn't get your data into textures?

All the framebuffer object features necessary to implement deferred shading are exposed by the GL_EXT_framebuffer_object extension, which is available on essentially all GL2+ drivers.
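On such drivers the same attachment calls exist with EXT suffixes; only the names and enum suffixes differ from the core versions. A sketch, assuming `fbo` and `colorTex` are created as before:

```c
/* Texture attachment via GL_EXT_framebuffer_object, usable on
 * GL2-era drivers that lack the core 3.x entry points. */
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, colorTex, 0);

if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)
        != GL_FRAMEBUFFER_COMPLETE_EXT) {
    /* handle incomplete framebuffer */
}
```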

I changed my depth buffer from a renderbuffer to a texture, with no measurable difference in performance. Of course, other graphics cards may show something else.