Alpha & composition

I am trying to build a scene in which a bunch of objects sitting on a plane are reflected in it (actually there may be multiple planes spaced vertically, but that isn't the problem).

Right now, I am rendering the reflected objects, a blended semi-transparent floor, and then the real objects. Additionally, I'm using stencils to limit the reflected objects to floor pixels, and to draw the projected shadow polygons exactly once per floor pixel.
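Concretely, the stencil setup looks roughly like this (a sketch using classic fixed-function GL; the `draw*()` calls are placeholders for my geometry):

```c
/* Pass 1: mark floor pixels in the stencil buffer (no color/depth writes). */
glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_ALWAYS, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glDepthMask(GL_FALSE);
drawFloorQuad();                       /* placeholder */
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glDepthMask(GL_TRUE);

/* Pass 2: draw reflected objects only where the stencil marked the floor. */
glStencilFunc(GL_EQUAL, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
drawReflectedObjects();                /* placeholder */

/* Shadows: decrement the stencil on the first shadow fragment, so each
   floor pixel is darkened exactly once even with overlapping shadow polys. */
glStencilFunc(GL_EQUAL, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_DECR);
drawShadowPolygons();                  /* placeholder */
```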

The problem I have is in how I want to texture the floor. All would be well if it were a single texture projected across the floor; however, it is actually a potentially large diagram made up of many overlapping images/decals.

Because of the size of the diagram, it is impractical to pre-render it as one texture: my decals might be 256x256 (which I want for close viewing), and at that size the fully rendered plane texture might be 50 times that in either dimension.

The crux of my problem is that I need to render the whole floor with a non-1.0 alpha. Obviously I have a compositing problem if I want opaque decals covering each other on the floor while the floor itself remains semi-transparent. I figure I could use a stencil or polygon offset to draw each floor pixel only once (from back to front), but that then means I can't have decal textures with meaningful alpha values (which I'd like for text).

I guess my ideal would be to render only the floor, fully opaque, to an off-screen buffer, and then composite that into the real scene (or use it as a texture without perspective correction).

Being new to OpenGL, I don't know if I'm missing something, but it seems like it will be hard to do this while still writing the correct stencil and depth buffer pixels as the plane is drawn to the real display. Ideally I'd like perspective-correct interpolation on everything but the texture.

Perhaps there is an easier way… I’m hoping!

Perhaps I've answered my own question, but I'd still appreciate any other suggestions.

Presumably I can draw a transparent floor quad to the real scene, writing the stencil buffer as well as the depth buffer.

I can then draw the pre-rasterized floor to the display buffer as a single non-perspective-correct texture according to the stencil, but without updating the stencil or depth buffers (or perhaps I can copy pixels one-for-one, affecting only pixels that pass the stencil mask and including alpha blending; I'll have to check).

This will work as long as I don't have any other semi-transparent geometry visible in front of the floor (which actually I will, but I can fix that by rendering the scene in the correct order).
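That two-step plan might look something like this (sketch only; the `draw*()` calls are placeholders, and the second quad is screen-aligned with the pre-rasterized floor bound as its texture):

```c
/* Step 1: draw the semi-transparent floor, writing stencil and depth. */
glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_ALWAYS, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
drawFloorQuad();                       /* placeholder */
glDisable(GL_BLEND);

/* Step 2: composite the pre-rasterized floor texture where the stencil
   matched, leaving depth and stencil untouched. */
glStencilFunc(GL_EQUAL, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
glDepthMask(GL_FALSE);
drawScreenAlignedFloorTexture();       /* placeholder */
glDepthMask(GL_TRUE);
```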

I would probably render the reflection into a separate FBO and then composite the two together afterwards.
If you overdraw the reflection with black semi-transparent polygons in the same order as you do in the other render buffer, you could probably get all the reflections into the same image.
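A minimal FBO setup sketch, using the core-profile entry points (`width`, `height`, and `drawReflectedScene()` are placeholders; you'd also want a depth attachment and a `glCheckFramebufferStatus()` check):

```c
/* Create a texture-backed FBO and render the reflection into it. */
GLuint fbo, tex;
glGenFramebuffers(1, &fbo);
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, tex, 0);
drawReflectedScene();                  /* placeholder */
glBindFramebuffer(GL_FRAMEBUFFER, 0); /* back to the window; tex now holds
                                         the reflection for compositing */
```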

Actually, I realized I can simplify (I'm not sure I have time to figure out FBOs yet)…

The only reason my floor is not opaque is so that I can see the reflection (inverted polygons rendered on the other side of the plane) through it.

Given that there is no other reason for the floor to be transparent, I figure I can:

  1. render the floor textures + 1 stencil bit fully opaque, but without updating the depth buffer
  2. render the reflected objects over the floor using alpha blending and depth testing (and using the stencil test to limit the pixels to the floor)
  3. render the floor again, but updating only the depth buffer (so that the floor then plays nice with the rest of the scene)
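In GL calls, that three-step plan might look like this (sketch only; `drawFloorDecals()`, `drawReflectedObjects()`, and `drawFloorQuad()` are placeholders for my geometry):

```c
/* Step 1: opaque floor decals + stencil mark, no depth writes. */
glDepthMask(GL_FALSE);
glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_ALWAYS, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
drawFloorDecals();                     /* placeholder */

/* Step 2: blend the reflections over the floor, clipped by the stencil. */
glStencilFunc(GL_EQUAL, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
drawReflectedObjects();                /* placeholder */
glDisable(GL_BLEND);

/* Step 3: write the floor's depth only, leaving color untouched. */
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glDepthMask(GL_TRUE);
drawFloorQuad();                       /* placeholder */
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
```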