Using the reflection vector in Spherical Environment Mapping

When I use Spherical Environment Mapping, I want to get UV coords from a vector.

OpenGL uses its own function to calculate
the UV coords for every vertex, given its normal. The function can be found in MSDN under glTexGen.
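For reference, that built-in sphere-map texgen is enabled per coordinate. A minimal setup sketch (standard GL 1.1 state calls, assuming a texture is already bound and a GL context is current):

```c
/* Enable OpenGL's built-in sphere-map texture coordinate generation.
   These replace per-vertex glTexCoord calls with coords derived from
   the eye-space normal. */
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_SPHERE_MAP);
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_SPHERE_MAP);
glEnable(GL_TEXTURE_GEN_S);
glEnable(GL_TEXTURE_GEN_T);
```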

If I want OpenGL to calculate the UV using the reflection vector, so I could use it for reflection, can I do that? (OpenGL’s function that is used to calculate the UV is not appropriate for me.)

No solution from me, but maybe a way around the problem.

You don’t say what you need, or in what way it’s not appropriate. But if you think that sphere mapping is, how to say, not powerful enough, then maybe you can look into the cube environment mapping extension. This one is a very accurate reflection method; if it’s what you need, I really recommend using it (if it’s available, of course).

Or maybe you can do the maths yourself. You said you found the formula in the MSVC6 docs, and I have seen it myself. It’s not too complex, so why not try to do it yourself? This, on the other hand, requires that you have the modelview matrix somewhere. Either get it directly from OpenGL, or use your own transformation code that holds a matrix for you, and when doing transformations, transform your own matrix as well as sending the transformation to OpenGL.

Well, I know about using the cube environment mapping extension, I just wanted to make sure that there is no way to do the sphere mapping in hardware using the reflection vector…

thanks anyway.

OpenGL uses the normal you give it to calculate the reflection vector. If you don’t like the reflection vector it uses, then just change the normal you send it. Easy, no? You just have to find an algorithm that maps your function onto the function OpenGL uses. (Hint: the inverse function of the function that OpenGL uses may prove useful).

[This message has been edited by DFrey (edited 01-08-2001).]

For what it’s worth, we do support the SPHERE_MAP texgen in hardware.

Not that that necessarily helps in this case.

  • Matt

No, it’s not easy, DFrey.
A. OpenGL uses the normal to calculate the lighting. I can’t just change the normal as I want.
B. I will have to apply the mapping function to every normal. That can be too expensive.

I know I can use two passes, one for the lighting and one for the texture but it will be too costly…

There is absolutely no rule that says the normals you give OpenGL must be the real vertex normals. So, yes, you can change the “normals” anytime you please.

I know that… But OpenGL will use those normals for calculating the light unless I use TWO passes…

With multitexturing you could light it in one pass. It would be expensive but it can be done. On the other hand, most modern hardware could easily handle two passes for it. Remember, many modern games for example have no problem using 4 or more passes for some parts, without appreciable slowdown.

[This message has been edited by DFrey (edited 01-09-2001).]

I would suggest using NV_vertex_program when it becomes available.

Wanting to do your own texgen is one really good example use of NV_vertex_program. In the interim, you can just do the calculation yourself and specify the texture coordinates directly.
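That interim approach might plug in like this (immediate-mode sketch; `compute_sphere_uv` is a hypothetical helper applying the sphere-map formula to the eye-space normal and position, and the per-vertex arrays are assumptions for illustration):

```c
/* Sketch: manual texgen in an immediate-mode draw loop, with
   texture coordinates we computed ourselves while the REAL normals
   still go to OpenGL for lighting, so one pass suffices. */
glBegin(GL_TRIANGLES);
for (i = 0; i < num_verts; i++) {
    float s, t;
    compute_sphere_uv(eye_normal[i], eye_pos[i], &s, &t);
    glTexCoord2f(s, t);           /* our own texgen result */
    glNormal3fv(obj_normal[i]);   /* untouched normal, kept for lighting */
    glVertex3fv(obj_pos[i]);
}
glEnd();
```

Note this sidesteps the earlier objection: since the texcoords are specified directly, the normals are free to stay correct for lighting in the same pass.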


Where can I read about it, Cass?

Right here at NVIDIA’s developer page

[This message has been edited by DFrey (edited 01-09-2001).]

I don’t think you can do it with multitexturing…

Yes you can, but again, you wouldn’t want to as it involves creating a (potentially) dynamic lightmap.