Projection via texgen - questions


A while back I implemented projective texturing using texgen, but I've always felt that perhaps I'd misunderstood how it was meant to work. In addition there appears to be a bug: if the projected texture has mipmaps, I get parts of the texture outside the projection frustum, even with clamping. From visual inspection it looks like it occurs where the textured mesh passes across the near plane of the projection. In the sample jpg, the bug is the thick yellow band at bottom right - this vanishes if I disable mipmapping.

So, any advice on why mipmapping causes the visual artifact? Any way to fix it? I tried using GL_CLAMP_TO_EDGE, as I think that provided the clamping behaviour I wanted.

Secondly, I'm interested in comments on the texgen code. I've recently read through several similar questions in this forum, but they left me more confused than when I started. One recurring comment was that you'd only have to update the texgen if the modelview changed. But don't you have to supply the texgen planes every time you want to render the projected texture? I'm sure I'm missing something obvious there, or have just misunderstood what was meant. I feel I understand the process of projecting the textures, but not how to represent it in code (or rather, which code is correct) - I'll admit I originally got this example working simply by trying out different approaches from code I'd seen online.

Anyway, the code is as follows, but I should point out this is not from within an actual game engine; instead it runs on the OpenGL stream (via GLtrace), because I'm adding this functionality to an existing product. (Sure, not the best way, but it works.) Because I'm relying solely on the OpenGL stream I can't pass explicit data, so I have to use some creative means of obtaining it. Don't worry about how I get this data - it's not relevant - but I'll quickly explain. Let's say that in the 3D world I have a projector model, and this projector projects a texture onto whatever it faces. To get the projection matrix and model matrix of this projector model I create a dummy camera and a dummy model. The dummy model is placed at the world origin so that its modelview matrix represents the projector model's actual position/orientation (or its inverse? It's been a while since I came up with this). I flag this model and store the current projection matrix and modelview matrix for later use.

// Dump current projection matrix into LUT via glexIndex
glGetFloatv(GL_PROJECTION_MATRIX, glexProjectionMTX[glexIndex]);
// Dump current modelview into LUT.
// The model must be at the world origin, so that the modelview matrix
// represents the position and orientation of the projection source (camera/light/whatever)
glGetFloatv(GL_MODELVIEW_MATRIX,  glexModelMTX[glexIndex]);

The TexGen code is as follows.
Out of interest, are there any specific ordering requirements for glEnable, glTexGeni and glTexGenfv? I've seen the order vary in almost every sample I've encountered.
Whether there are or not, is there any standard convention for the order?

Just interested in whether I'm fundamentally misunderstanding what I'm meant to be doing, and it just happens to work :wink:
Perhaps the code can be simplified, although I realise I could fold the load-identity, scale and translate into a single precalculated matrix and save some time there.
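For what it's worth, that precalculated matrix is easy to write down: translate(0.5, 0.5, 0) * scale(0.5, 0.5, 1) collapses into one constant, column-major array that could be passed straight to glMultMatrixf in place of the glTranslatef/glScalef pair. A sketch only - glexBiasMTX is my made-up name, not from the original code:

```c
/* Bias matrix mapping the projector's clip-space range [-1, 1] into
 * texture space [0, 1]: translate(0.5, 0.5, 0) * scale(0.5, 0.5, 1),
 * written in OpenGL's column-major order. */
static const float glexBiasMTX[16] = {
    0.5f, 0.0f, 0.0f, 0.0f,   /* column 0 */
    0.0f, 0.5f, 0.0f, 0.0f,   /* column 1 */
    0.0f, 0.0f, 1.0f, 0.0f,   /* column 2 */
    0.5f, 0.5f, 0.0f, 1.0f    /* column 3: the translation */
};
```

So the setup would become glMultMatrixf(glexBiasMTX) followed by the glMultMatrixf of the stored projection matrix.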


// TexGen props set elsewhere
GLfloat			glexPS[4]  = {1, 0, 0, 0};
GLfloat			glexPT[4]  = {0, 1, 0, 0};
GLfloat			glexPR[4]  = {0, 0, 1, 0};
GLfloat			glexPQ[4]  = {0, 0, 0, 1};

// Projective Texturing: World Space Based 

glPushMatrix();   // Popped later, so OpenGL is left in the state the application thinks it's in - just in case

glTranslatef(0.5f, 0.5f, 0.0f);
glScalef(0.5f, 0.5f, 1.0f); 	
glMultMatrixf(glexProjectionMTX[glexIndex]);  // Get projection matrix from stored data

glTexGenfv(GL_S, GL_EYE_PLANE, glexPS);
glTexGenfv(GL_T, GL_EYE_PLANE, glexPT);
glTexGenfv(GL_R, GL_EYE_PLANE, glexPR);
glTexGenfv(GL_Q, GL_EYE_PLANE, glexPQ);

// Fix clamping on ATI  

Interesting issue with the plane. I think what's happening here is that the partial derivatives of s and t used to compute the mip LOD become infinite as you near the half-space plane of the projection. The filtering therefore chooses a very low mip level, and because that image is filtered down from the higher-resolution levels, all of its samples have some non-black value. You need to ensure all mip levels have a black border and that the degenerate (smallest) levels are black.

You could use clamp-to-border with a border colour as well, but that is inefficient on some hardware.
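For reference, the border-colour route mentioned above is just texture state - a sketch, assuming the projected texture is already bound to GL_TEXTURE_2D:

```c
/* Clamp coordinates outside [0, 1] to a constant border colour rather
 * than the edge texels; with a black border, samples outside the
 * projection frustum modulate to black. As noted, this path is slow
 * on some hardware. */
const GLfloat black[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_BORDER);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_BORDER);
glTexParameterfv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, black);
```

GL_CLAMP_TO_BORDER needs GL 1.3 or ARB_texture_border_clamp.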

w.r.t. updating the texgen, you only need to do that if you texgen in object space. You can texgen in eye space and the results will be consistent.
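If I can expand on the eye-space variant: the usual recipe is to build the planes on the CPU as the rows of bias * projectorProjection * projectorView, and specify them with glTexGenfv(GL_EYE_PLANE, ...) while the camera's view matrix is current on the modelview stack - GL folds in the inverse of that modelview for you, so the planes stay consistent in eye space. A sketch of the CPU side only, under those assumptions (function names are mine):

```c
/* Multiply two 4x4 matrices in OpenGL column-major order: out = a * b. */
static void mat4_mul(float out[16], const float a[16], const float b[16])
{
    for (int c = 0; c < 4; ++c)
        for (int r = 0; r < 4; ++r) {
            float s = 0.0f;
            for (int k = 0; k < 4; ++k)
                s += a[k * 4 + r] * b[c * 4 + k];
            out[c * 4 + r] = s;
        }
}

/* Extract row r of a column-major matrix as a texgen plane. */
static void mat4_row(float plane[4], const float m[16], int r)
{
    for (int c = 0; c < 4; ++c)
        plane[c] = m[c * 4 + r];
}

/* planes[0..3] become the GL_EYE_PLANE rows for GL_S, GL_T, GL_R, GL_Q:
 * rows of (bias * projectorProjection * projectorView). */
static void build_projector_planes(float planes[4][4],
                                   const float projProj[16],
                                   const float projView[16])
{
    static const float bias[16] = {          /* [-1,1] -> [0,1] in s and t */
        0.5f, 0.0f, 0.0f, 0.0f,
        0.0f, 0.5f, 0.0f, 0.0f,
        0.0f, 0.0f, 1.0f, 0.0f,
        0.5f, 0.5f, 0.0f, 1.0f
    };
    float tmp[16], m[16];
    mat4_mul(tmp, projProj, projView);
    mat4_mul(m, bias, tmp);
    for (int r = 0; r < 4; ++r)
        mat4_row(planes[r], m, r);
}
```

The glTexGenfv(GL_S, GL_EYE_PLANE, planes[0]) calls (and the other three) would then only need reissuing when the projector moves, not per frame.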

Thanks Dorbie,

Yeah, I assumed something like that was happening with the mipmaps (since the artifact vanished without them), but I wasn't too keen on forcing the mipmap levels to all black or whatever. Although it would solve the immediate issue, what would happen if the projected texture hit a surface far enough away that it required that mip level? I'd lose the projection. Perhaps that's a drawback you have to live with?

I do remember reading something ages ago about using a 1D texture (two pixels, one black, one white) that could somehow be set up so that white got applied inside the frustum area and black outside it, but I never quite grasped how it was achieved.

w.r.t. (?)
Sorry, I'm not quite following the 'texgen in eye space being consistent' part - consistent with regard to what?