Shaders on alternate rendering targets?

  • Can GLSL vertex and fragment shader programs work when writing to PBuffers? It seems like they don’t in my experiments. What about framebuffers? Do shaders work when writing onto a framebuffer?

  • When writing to the screen, can multiple framebuffer textures be selected into multiple ARB targets during rendering? From my experiments, it seems like they can’t, though PBuffers could.

  • Can GLSL vertex and fragment shader programs work when writing to PBuffers? It seems like they don’t in my experiments. What about framebuffers? Do shaders work when writing onto a framebuffer?
    Shaders and the render target are (almost) completely orthogonal; the only coupling is what the shader outputs.

That being said, implementations can always be buggy.

When writing to the screen, can multiple framebuffer textures be selected into multiple ARB targets during rendering?
… What does that mean? What is a “framebuffer texture”, and how do you select it into an “ARB target”?

If by “framebuffer texture” you mean “a regular old texture that I happened to use as a render target at some point in the past,” and by “ARB target” you mean “texture unit,” then yes: you can use as many previously-rendered-to textures as you like.

Do note however the presence of the word “previously”. As in, “these textures cannot currently be bound to the framebuffer object that is bound for rendering”. Or, at the very least, the mipmap levels you are reading from cannot be selected as rendering surfaces.
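For illustration, a minimal sketch of that ordering, assuming GL_EXT_framebuffer_object (fbo and colorTex are placeholder names, not anything from your code):

    /* render to colorTex through an FBO, then unbind before sampling it */
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, colorTex, 0);
    /* ... draw the off-screen pass ... */

    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);  /* back to the window */

    /* colorTex is no longer a render target, so it is now legal to sample */
    glActiveTextureARB(GL_TEXTURE0_ARB);
    glBindTexture(GL_TEXTURE_2D, colorTex);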

Let me explain what I am doing, in case that helps.

I am setting up 3 off-screen textures as rendering targets (at the moment, they are PBuffers). The pixel width and height of these textures are the same as the current view’s pixel dimensions.

  • I render my scene’s color data into one texture.
  • I render my scene’s surface normal data into another texture. (This texture is later used as a dot3 normal map.)
  • I render my scene’s camera-space distance-from-the-camera data into the red channel of the third texture, and the scene’s specular map (“shininess” amount) into its green channel.
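(For reference, a rough sketch of what that setup could look like with one FBO and three view-sized textures, assuming GL_EXT_framebuffer_object; tex, viewW and viewH are placeholders:)

    GLuint fbo, tex[3];
    glGenFramebuffersEXT(1, &fbo);
    glGenTextures(3, tex);
    for (int i = 0; i < 3; ++i) {
        glBindTexture(GL_TEXTURE_2D, tex[i]);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        /* view-sized, so NPOT support (or rectangle textures) is needed
           when the view is not a power of two */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, viewW, viewH, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    }

    /* one pass per texture: attach, draw, repeat */
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    for (int i = 0; i < 3; ++i) {
        glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                                  GL_TEXTURE_2D, tex[i], 0);
        /* ... render the color / normal / distance+specular pass ... */
    }
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);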

Once all that off-screen rendering is done:

  • I set the screen as my rendering target.
  • Then set my vert and frag programs as active.
  • Then I select the 3 textures into ARB texture units 0, 1 and 2.
  • Pass some values to the shaders’ uniforms.
  • Then I render a quad fitted to the dimensions of the view.
  • My fragment shader does the rest. (A rough sketch of these steps follows below.)

(BTW, these last few steps are done once per light, with a fourth off-screen texture that contains the lighting info: color from the light source, attenuation due to distance from the light source, and occlusion from shadows.)
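Roughly, the composition pass in code (prog, tex and the uniform names are placeholders, assuming ARB_shader_objects and ARB_multitexture, and an identity modelview/projection):

    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);   /* screen is the target */
    glUseProgramObjectARB(prog);                   /* vert + frag program */

    for (int i = 0; i < 3; ++i) {                  /* units 0, 1 and 2 */
        glActiveTextureARB(GL_TEXTURE0_ARB + i);
        glBindTexture(GL_TEXTURE_2D, tex[i]);
    }
    glUniform1iARB(glGetUniformLocationARB(prog, "colorMap"),    0);
    glUniform1iARB(glGetUniformLocationARB(prog, "normalMap"),   1);
    glUniform1iARB(glGetUniformLocationARB(prog, "distSpecMap"), 2);

    /* one set of texcoords suffices: all three textures are screen-aligned,
       so the fragment shader can reuse it for every sampler */
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(-1, -1);
    glTexCoord2f(1, 0); glVertex2f( 1, -1);
    glTexCoord2f(1, 1); glVertex2f( 1,  1);
    glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();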

It works fine.
However, I experimented briefly with using framebuffers instead of pbuffers for my off-screen work and found that I couldn’t select all 3 into ARB units. At least, that is how it seemed… My fragment program was getting 0,0,0,0 for color values when sampling them. When I use pbuffers, it works fine.

Also, I experimented with writing a vert and frag program to use when drawing to the pbuffer target. They didn’t get executed. I was wondering if we were only allowed to use shaders when the rendering target was the screen.

Everything you are doing ought to be possible, so it is either a bug in your code or a bug in the driver.

However, I experimented briefly with using framebuffers instead of pbuffers for my off-screen work and found that I couldn’t select all 3 into ARB units.
Are you using rectangle textures or NPOT textures? If not, then that’s probably your problem. Otherwise, there’s probably a bug in your FBO code. Make sure the framebuffer’s status is complete before you start rendering with it.
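For example, a minimal check along those lines, assuming GL_EXT_framebuffer_object:

    /* call after attaching everything, before the first draw */
    GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
    if (status != GL_FRAMEBUFFER_COMPLETE_EXT) {
        /* GL_FRAMEBUFFER_UNSUPPORTED_EXT often means an unsupported
           format combination; other values point at bad attachments */
        fprintf(stderr, "FBO incomplete: 0x%04x\n", status);
    }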

Also, I experimented with writing a vert and frag program to use when drawing to the pbuffer target. They didn’t get executed.
PBuffers have their own rendering contexts, which means you must share your shader/program objects with the pbuffer’s context before you can use them to render in it.
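For example, on Windows the sharing is done with wglShareLists, and it has to happen before you create objects in the second context (hdcWindow and hdcPbuffer are placeholders; on other platforms the share context is passed at context creation instead, e.g. via aglCreateContext or glXCreateContext):

    HGLRC windowCtx  = wglCreateContext(hdcWindow);
    HGLRC pbufferCtx = wglCreateContext(hdcPbuffer);

    /* share display lists, textures and shader objects between the two;
       do this before any objects exist in pbufferCtx */
    if (!wglShareLists(windowCtx, pbufferCtx)) {
        /* sharing failed; shaders compiled in one context will not be
           visible in the other */
    }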

That makes a lot of sense, thank you.

Which do you think is better speedwise: Pbuffers or framebuffers?

Framebuffers are a newer technology, correct? I am concerned that fewer cards will support framebuffers than pbuffers.

PBuffers are outdated. Use FBOs. Every card that supports render-to-texture also supports FBOs. And if a driver has buggy FBO support, you can assume its PBuffer support is even worse.
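One way to decide at runtime is to look for the extension string and only fall back to pbuffers when it is missing; a minimal sketch:

    /* requires a current GL context; a rigorous check should match whole
       extension names rather than substrings */
    const char *ext = (const char *) glGetString(GL_EXTENSIONS);
    int hasFBO = (ext != NULL)
              && (strstr(ext, "GL_EXT_framebuffer_object") != NULL);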

Jan.

Thanks Jan. I think I will take another crack at porting the code over to use FBOs.
