I'm using automatic texture coordinates, but now I want to plug this into a vertex/fragment program. I assume that the S and T vectors I use for texture coordinate generation, along with the SxT vector, should work for the appropriate texture transform. Am I correct in this assumption?

If I remember correctly, fixed-function T&L is disabled when a vertex shader is active, so you'll have to do your texgen on your own.

Yes, but the question is: do I have to do some funky magic, or are these S and T vectors the ones I need?

state.texgen[n].eye.s is one texgen plane, as specified somewhere in your application.

You then use that to create the texture coordinate yourself.

Shadow coordinate generation would look something like this:

# texgen emulation
DP4 temp.x, state.texgen[n].eye.s, vertPos;
DP4 temp.y, state.texgen[n].eye.t, vertPos;
DP4 temp.z, state.texgen[n].eye.r, vertPos;
DP4 temp.w, state.texgen[n].eye.q, vertPos;

# texture matrix multiply
DP4 coord.x, textureMatrix[0], temp;
DP4 coord.y, textureMatrix[1], temp;
DP4 coord.z, textureMatrix[2], temp;
DP4 coord.w, textureMatrix[3], temp;

vertPos is the vertex position multiplied by the modelview matrix (i.e. the eye-space vertex position, since these are eye-linear planes).
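The DP4 sequence above is just two sets of 4-component dot products. Here is a plain-Python sketch of the same math, useful for sanity-checking your planes and texture matrix on the CPU (the function names are my own, not part of any GL API):

```python
def dp4(a, b):
    """4-component dot product, same as the DP4 instruction."""
    return sum(x * y for x, y in zip(a, b))

def texgen_coord(eye_planes, texture_matrix, vert_pos):
    """eye_planes: the s/t/r/q eye-linear planes (what
    state.texgen[n].eye.* binds to); texture_matrix: four row
    vectors; vert_pos: the eye-space vertex position."""
    # texgen emulation: one DP4 per plane
    temp = [dp4(plane, vert_pos) for plane in eye_planes]
    # texture-matrix multiply: one DP4 per matrix row
    return [dp4(row, temp) for row in texture_matrix]

# With identity planes and an identity texture matrix, the eye-space
# position passes straight through:
identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
print(texgen_coord(identity, identity, [2.0, 3.0, 4.0, 1.0]))
# → [2.0, 3.0, 4.0, 1.0]
```

For shadow mapping, eye_planes would be your light-space planes and texture_matrix the usual bias * lightProjection * lightView matrix.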

Hope that gives you an idea of what you need to do. As usual, check NVIDIA's site for texgen docs; they've done a wonderful job putting papers online that explain the math so you can write your vertex programs.

[This message has been edited by titan (edited 05-11-2003).]