Reflective texture mapping


I am currently using a cube map with GL_REFLECTION_MAP_EXT texture coordinate generation to render reflections of the sky and sun on a water surface. Here is a screenshot:

The cube map works for reflecting objects that are infinitely far away, i.e. the skybox, but does not work for local objects, such as my terrain. I would like to change my method of drawing reflections so that I can have reflections of local objects too.

In order to do this, I intend to render the scene, flipped about the reflection plane, into a texture, and then map that texture onto my water surface. I have a few questions about generating texture coordinates for drawing such a reflection texture.
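In case it helps to see what I mean by the flipped pass: here is a rough sketch of the reflection transform (my own illustration, assuming the water lies in the horizontal plane y = h; row-major matrix, hypothetical names). You would multiply this onto the modelview matrix before drawing the reflected scene into the texture.

```c
/* Sketch: a 4x4 row-major matrix that mirrors geometry about the
 * plane y = h (the water surface). Assumed setup, not working code
 * from my project. */
typedef struct { float m[4][4]; } Mat4;

static Mat4 reflect_about_y(float h)
{
    Mat4 r = {{{1.0f,  0.0f, 0.0f, 0.0f},
               {0.0f, -1.0f, 0.0f, 2.0f * h},  /* y -> 2h - y */
               {0.0f,  0.0f, 1.0f, 0.0f},
               {0.0f,  0.0f, 0.0f, 1.0f}}};
    return r;
}

/* Apply the matrix to a point (x, y, z, 1). */
static void transform_point(const Mat4 *r, const float in[3], float out[3])
{
    for (int i = 0; i < 3; ++i)
        out[i] = r->m[i][0] * in[0] + r->m[i][1] * in[1]
               + r->m[i][2] * in[2] + r->m[i][3];
}
```

Note that a pass like this also wants a user clip plane at the water surface (so geometry below the water is not reflected) and reversed face winding, since mirroring flips the handedness.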

  1. Is there a way to automatically generate the texture coordinates, similar to using GL_REFLECTION_MAP_EXT, but for something other than cube maps?

  2. If the answer to #1 is no, would there be a way to render the reflection into one face of the cube map, and use texture matrix tricks to make GL_REFLECTION_MAP_EXT still work?

  3. If the answers to #1 and #2 are both no, what formula should I use to generate the texture coordinates manually on the CPU? It must take the vertex normals into account.

Any help would be greatly appreciated. Thanks in advance.

Yes: if you render a texture that looks like the reflection in screen space, then you can use EYE_LINEAR texgen (or OBJECT_LINEAR, if you transform the axes suitably).
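Roughly speaking, what that texgen setup produces is the vertex's window position remapped into [0,1]. Here is a CPU sketch of the same math (illustrative only, hypothetical names): given a clip-space position (x, y, z, w) = projection * modelview * vertex, the texture coordinate is the scale-and-biased perspective divide.

```c
/* Sketch of the screen-space texture coordinate that EYE_LINEAR
 * texgen (with planes set from the scale-biased modelview-projection
 * matrix) effectively yields. Assumes clip[] = MVP * vertex. */
static void screen_space_texcoord(const float clip[4], float st[2])
{
    st[0] = 0.5f * (clip[0] / clip[3]) + 0.5f;  /* s in [0,1] */
    st[1] = 0.5f * (clip[1] / clip[3]) + 0.5f;  /* t in [0,1] */
}
```

In practice, with texgen you would leave the divide to the texture unit by emitting s = 0.5x + 0.5w, t = 0.5y + 0.5w, q = w, which avoids per-vertex perspective artifacts.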

Thanks for your reply. However, EYE_LINEAR and OBJECT_LINEAR do not take the vertex normals into account. The vertex normals are what give you the distorted, watery reflection, like in the cube map screenshot I showed. If I wanted a flat reflection, I wouldn't need render-to-texture at all, because I could just flip the geometry and draw it directly over the water with the water area stenciled.

I don't think there is anything in OpenGL that will automatically generate the texture coordinates the way I need them. My question, then, is: how can I calculate them myself?

[This message has been edited by ioquan (edited 10-05-2003).]

I suggest plugging a perturbation into the texture coordinates, based on the normal dotted with the forward vector and the normal dotted with the sideways vector.

This would be simple to do using ARB_vertex_program, but you could also do it manually and just send in pre-cooked texture coordinates. As long as your water is low in vertex count, the additional CPU hit should be negligible.
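A rough CPU sketch of that perturbation (my own illustration, hypothetical names; the scale factor and the exact axes depend on your setup): start from the flat screen-space (s, t) and nudge it by how far the normal leans along the camera's right and forward axes, so an undisturbed water normal leaves the coordinate unchanged.

```c
static float dot3(const float a[3], const float b[3])
{
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

/* Sketch: offset the screen-space coordinate (s, t) by the
 * projection of the per-vertex normal onto the camera's right and
 * forward vectors. `scale` controls the ripple strength; a normal
 * pointing straight up (flat water) produces no offset. */
static void perturb_texcoord(float st[2],
                             const float normal[3],
                             const float cam_right[3],
                             const float cam_forward[3],
                             float scale)
{
    st[0] += scale * dot3(normal, cam_right);
    st[1] += scale * dot3(normal, cam_forward);
}
```

The same arithmetic maps directly onto an ARB_vertex_program (two DP3 instructions and a MAD), so the CPU fallback and the vertex-program path can share one formula.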

Thanks for your reply.

Yes, I was thinking of doing this with a vertex program. It would be a good way for me to get started with vertex programs. However, I still want to have a version that does it on the CPU, for people whose hardware doesn't support vertex programs.

The math is the same either way, and I'm not exactly sure what it should be. In your suggestion, do the forward and sideways vectors refer to the forward and right vectors of the camera? Would those vectors and the normal all need to be in world coordinates? And once you took the dot products, how exactly would you use them to distort the texture coordinates?

[This message has been edited by ioquan (edited 10-05-2003).]