Vertex Shader generating 2 output vertices?

I’m writing a 3d texture renderer supporting volumetric lighting.
It currently runs using multitexturing and does a lot of the work on the CPU, but now I’d like to accelerate it by moving most of that code into vertex and fragment shaders.
The problem is I know almost nothing about shaders yet (well, I saw a few examples). So I’m here to ask what is possible and what isn’t. Maybe you can just point me to the right references / books.
I’m mainly developing for nvidia/cg, it’s a semester thesis.

I have a 3d object = volume which must be rendered. Therefore I slice this object into many 2d slices:
For each of these slices:
1st pass: render the slice for the eye into the framebuffer
– for this render pass, use a 3d texture as transfer function for the volumetric model
– and use the 2d texture which was generated in the previous 2nd pass (combine the 2d texture with the 3d texture and blend the result into the eye buffer)

2nd pass: render the same slice for the light into the lightbuffer
– again use the 3d texture and blend into the light buffer
– render directly into the light buffer, e.g. with the framebuffer object extension

end for each slice;

So right now I’m using multitexturing for the texture lookups and combinations (modulation) and glBlendFunc for the blending etc.
I’ll replace this with fragment shaders.
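To make the idea concrete, here is a rough sketch of what the pass-1 fragment shader could look like in Cg. The sampler names, the input semantics, and the plain modulation are my assumptions based on the description above, not code from this project:

```
// Pass 1 (sketch): sample the 3d volume through the transfer function
// and modulate it with the 2d light texture, replacing the
// fixed-function multitexture combiner setup. Blending into the eye
// buffer still happens afterwards via glBlendFunc.
float4 main(float3 volCoord    : TEXCOORD0,  // position in volume space
            float2 lightCoord  : TEXCOORD1,  // position in the light buffer
            uniform sampler3D volumeTex,     // 3d texture / transfer function
            uniform sampler2D lightTex) : COLOR  // light buffer from pass 2
{
    float4 density = tex3D(volumeTex, volCoord);
    float4 light   = tex2D(lightTex, lightCoord);
    return density * light;  // modulation, as with GL_MODULATE
}
```

The second pass would use a similar, simpler shader that only samples the 3d texture.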

For the first pass I need a different fragment shader than for the second pass.

And in the first pass, I already need the coordinates for the volumetric lighting texture since I’m using the texture to modulate the 3d volume data.

So I thought of a single vertex shader that takes some inputs, computes the vertex coordinates for both passes, and outputs 2 vertices.
1 output vertex for the first pass and 1 output vertex for the 2nd pass.
Maybe the vertex shader could also define which output vertex will use which fragment shader.

Why do I want to combine the 1st and the 2nd pass?
Because the lighting texture coordinates that must be computed for the 1st pass are the same as the vertex coordinates for the light buffer. So I could save some computations there.
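Even without emitting two vertices, the shared computation can be expressed by reusing the same light-space transform in both passes’ vertex shaders. A sketch of the pass-1 vertex shader in Cg (the matrix names and the bias into texture space are assumptions):

```
// Pass 1 vertex shader (sketch): the light-space transform that
// positions the vertex in pass 2 is reused here to produce the
// lighting texture coordinate. lightMatrix is assumed to include
// the light's view/projection plus a bias into [0,1] texture space.
void main(float4 position         : POSITION,
          float3 volCoord         : TEXCOORD0,
          uniform float4x4 modelViewProj,  // eye transform
          uniform float4x4 lightMatrix,    // light transform + texture bias
          out float4 oPosition    : POSITION,
          out float3 oVolCoord    : TEXCOORD0,
          out float2 oLightCoord  : TEXCOORD1)
{
    oPosition = mul(modelViewProj, position);
    oVolCoord = volCoord;

    // Same computation that pass 2 does for its POSITION output:
    float4 lightPos = mul(lightMatrix, position);
    oLightCoord = lightPos.xy;
}
```

The per-vertex cost of the duplicated matrix multiply is usually negligible compared to the per-fragment work in volume rendering, so duplicating it in two shaders may be cheaper overall than fighting the pipeline to share it.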


  1. Can a vertex shader output two vertices?
  2. Can 2 different fragment shaders be loaded at the same time? And how can you select them? Can you select in the vertex shader the fragment shader that should be used?
  3. Can a fragment shader select to which render target / framebuffer it outputs?
  4. Is there something like “streams” or logically parallel pipelines? I’d like to have loaded 2 different vertex shaders and 2 different fragment shaders at the same time, such that I don’t have to switch the binding x times etc. (2 pass algorithm for each volumetric slice)
  5. Where could I find such information? (books, guides, references?)

Thanks - Andy


1: No.

2: No.

3: A fragment program can render to multiple outputs, if the hardware exposes support for it. I forget whether the extension was folded into GL 2.0 or not. However, I’m not sure if it is syntactically OK to skip writing to a particular buffer (that is, to write to a buffer only conditionally).

4: No.

  1. As Korval said, this is not possible. A vertex shader operates on exactly one vertex; it cannot add or delete vertices. That will become possible with the geometry shaders that hopefully come up soon.

  2. No, you can only have one shader of each type bound at a time. But you can use a hack with an if statement:

    if (use_shader1) {
        // code of shader 1
    } else if (use_shader2) {
        // code of shader 2
    }

    and so on. You then pass these selector values in via vertex attributes.

  3. You can use the ARB_draw_buffers extension for rendering to multiple targets (textures, …)

  4. No, you have to switch shaders for something like that. But you can use the hack from point 2.
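For point 3, a fragment program can declare several color outputs via the COLOR0/COLOR1 semantics when ARB_draw_buffers is supported. A sketch in Cg (sampler and struct names are made up; note this only helps if both outputs share the same geometry and projection, which the eye and light passes here do not, so it is illustrative only):

```
// Sketch: one fragment program writing to two render targets at once
// (requires ARB_draw_buffers support). Which attached buffers receive
// COLOR0 and COLOR1 is selected on the C side with glDrawBuffersARB.
struct FragOut {
    float4 eye   : COLOR0;  // first draw buffer
    float4 light : COLOR1;  // second draw buffer
};

FragOut main(float3 volCoord    : TEXCOORD0,
             float2 lightCoord  : TEXCOORD1,
             uniform sampler3D volumeTex,
             uniform sampler2D lightTex)
{
    FragOut o;
    float4 density = tex3D(volumeTex, volCoord);
    o.eye   = density * tex2D(lightTex, lightCoord);  // lit eye color
    o.light = density;                                // light attenuation
    return o;
}
```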
