I’m writing shaders in an application called Quartz Composer, and trying to get a shader working that uses Normal and Tangent vectors passed in to the shader from the containing application. Unfortunately, Quartz Composer only passes the Normal. Is it possible to calculate the Tangent vector from the Normal in the vertex shader? Alternatively, is it possible to approximate the Tangent in some other way, purely inside the Shader program?

There are ways. If you’re planning on using it for normal mapping, then I think dFdx/dFdy on the texture coordinates is the best bet.
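For what it's worth, a fragment-shader sketch of the dFdx/dFdy approach might look like this (untested; the varying names are illustrative, and the sign can flip where the texture mapping is mirrored):

```glsl
varying vec3 vPosition;   // eye-space position, passed from the vertex shader
varying vec3 vNormal;     // interpolated normal
varying vec2 vTexCoord;

// Solve dp = T*ds + B*dt for T, the direction in which s increases on the surface.
vec3 tangentFromDerivatives()
{
    vec3 dpdx = dFdx(vPosition);
    vec3 dpdy = dFdy(vPosition);
    vec2 dtdx = dFdx(vTexCoord);
    vec2 dtdy = dFdy(vTexCoord);
    vec3 t = dpdx * dtdy.t - dpdy * dtdx.t;   // unnormalized tangent
    vec3 n = normalize(vNormal);
    return normalize(t - n * dot(n, t));      // re-orthogonalize against the normal
}
```

Since this needs screen-space derivatives, it only works in the fragment shader, not the vertex shader.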

Another way is as follows: Find the component of the normal with the highest absolute value (say the y component). Set the other components to zero. Cross this new vector with the normal and normalize to get one of the tangent vectors. Cross the new tangent and the normal and normalize to get the second tangent vector.

However, since the choice of component will affect the tangents, it might lead to some visual artifacts.

thanks for getting back to me so quickly!
I’ll look into this method.

I’m afraid I don’t have the option of pre-calculating the Tangent in the application, as I’m not making an application from scratch, but using a pre-built application that allows the use of GLSL shaders.

thanks for the rapid response!
I’m not intending to use the Tangent for bump-mapping on this occasion (though in the future I may do), so your alternative method looks like it might be the one to go for. Actually, since I need the Tangent in the Vertex shader to do vertex-displacement, this is definitely the way to go. I’m not sure how accurate the Tangent estimate has to be, but I’ll give your method a go.

Do you happen to have a code-snippet to do the relevant calculations? I guess you’d use nested max functions to work out which component of the Normal was largest… Probably a very silly question, but what should I use for the 3rd Tangent vector?

I must admit I’ve never implemented it in GLSL. The pseudocode would look something like

max = abs(normal.x)
idx = 0
if (abs(normal.y) > max) {
    max = abs(normal.y)
    idx = 1
}
if (abs(normal.z) > max) {
    max = abs(normal.z)
    idx = 2
}
v = (0, 0, 0)
v[idx] = normal[idx]
tangent1 = normalize(cross(normal, v))
tangent2 = normalize(cross(normal, tangent1))

I guess in GLSL you could do something like

vec3 absnormal = abs(normal);
float m = max(absnormal.x, max(absnormal.y, absnormal.z)); // naming this "max" would shadow the built-in function
vec3 v = normal * step(m, absnormal); // zeroes every component smaller than the largest
vec3 tangent1 = normalize(cross(normal, v)); // degenerate if the normal is exactly axis-aligned (then v == normal)
vec3 tangent2 = normalize(cross(normal, tangent1));

Didn’t actually test this code, so no promises, but it shouldn’t be far off, I think.

To be honest, I can’t remember if “v[idx] = normal[idx]” is strictly necessary or if you could simply use “v[idx] = 1”. The only difference is a sign flip whenever normal[idx] is negative, which would affect handedness, but I can’t remember how critical that is. If it’s not, you could simply drop the multiplication by normal in the step() line above.
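Spelled out, the “v[idx] = 1” variant would be (again untested; the only difference from the version above is a possible sign flip, and thus flipped handedness, wherever the largest normal component is negative):

```glsl
vec3 absnormal = abs(normal);
float m = max(absnormal.x, max(absnormal.y, absnormal.z));
vec3 v = step(m, absnormal);                  // basis-axis direction, always non-negative
vec3 tangent1 = normalize(cross(normal, v));  // may point opposite to the other version
vec3 tangent2 = normalize(cross(normal, tangent1));
```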

I guess Tangent1 in your code is the Tangent vector
and Tangent2 is the BiTangent vector…

I don’t actually need the BiTangent for the particular code I’m working on, but it’s good to know how to calculate it, anyway.

Just out of interest, why is it not possible to use Tangent/BiTangent vectors calculated this way to shift texture coords or do bumpmapping? Is it because they won’t be properly interpolated between vertices if passed as varyings to the Fragment Shader?

Toneburst… tangent and binormal (or bitangent) vectors depend on the texture mapping. They are basically the directions of the s and t texture axes on the surface (that is why tangent space is more correctly called “texture space”). Actually, you can take any vectors, as long as they lie in the plane of the triangle and form an orthonormal basis with the normal; this is what Lord_crc’s code does. The problem: when you use this for things like bumpmapping, it will distort your results, as you will get skewed vectors.

And yes, the choice of major axis (the computation of the v vector in the above code) may change between the vertices of a single triangle, which makes the tangent change in a discontinuous way.

Although there are other (similar) ways to compute a tangent without additional information, there’s no way (afaik) to avoid it either breaking down or becoming discontinuous somewhere.

edit: and as Zengar mentioned, for bump mapping etc, you’ll want to be able to change the normal along the axes defined by the texture.

I think it is just necessary to have a vector that is not collinear (or nearly collinear) with the normal to compute the cross product correctly; a vector orthogonal to the normal is actually the best case, since the cross product only degenerates when the two vectors are parallel.

So you can cross the normal with one of the space’s basis vectors, (1, 0, 0), (0, 1, 0) or (0, 0, 1), as long as the normal is not collinear with the chosen vector.

just to get things clear in my mind, is it safer to calculate the Tangent and BiTangent using Lord crc’s suggested method, or just to cross the Normal with
(1.0,0.0,0.0),
(0.0,1.0,0.0) or
(0.0,0.0,1.0)
?

Is one of the above vecs preferable for calculating the Tangent and BiTangent?

Lord crc’s method is the same as crossing the normal with one of (1, 0, 0), (0, 1, 0) or (0, 0, 1), scaled by the kept component.

You just have to take care that the vector you choose is not collinear (or nearly collinear) with the normal, like I said before.

For example, if you choose (1, 0, 0), you have to check that the normal is not aligned with it, i.e. that abs(normal.x) is not close to 1.
If the check fails, you have to choose the next vector, (0, 1, 0), then do the same verification.
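In GLSL, that fallback might look something like this (untested sketch; the degenerate case is the normal being nearly parallel to the chosen axis, and the 0.99 threshold is arbitrary):

```glsl
// Cross the normal with the first basis vector it is not nearly collinear with.
vec3 arbitraryTangent(vec3 n)
{
    vec3 axis = vec3(1.0, 0.0, 0.0);
    if (abs(n.x) > 0.99)                  // normal (almost) along x: use y instead
        axis = vec3(0.0, 1.0, 0.0);
    return normalize(cross(n, axis));
}
```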

This method creates an arbitrary tangent or binormal vector. We are just sure that the generated vector lies in the tangent plane.

The usability of this kind of tangent space is very limited, because it is not the texture space derived from the texture coordinates.

So the Tangent and BiNormal should run in the same direction as the texture U and V coordinates to be properly usable for bump-mapping, for example? So, this can only be done properly by comparing the normal at more than one vertex, hence you can’t do it in a vertex shader? Or am I completely wrong?

Sorry for the silly questions, just wrestling with the concepts here…

So, this can only be done properly by comparing the normal at more than one vertex, hence you can’t do it in a vertex shader? Or am I completely wrong?

No, you can’t do it in a vertex shader, but you can in a fragment shader, by taking derivatives of the texture coordinates with dFdx/dFdy. Another option is to do it in a geometry shader, but I have never tried that, and only the latest generation of graphics cards supports it.

I see. Thanks very much for clarifying that for me dletozeun.

I’m really interested in Geometry shaders, but with the tools I have at my disposal, I can’t currently use them. Maybe by the time my knowledge has progressed to the point where I could do something useful with them, they’ll be supported by Quartz Composer.

I’ll definitely look into the dFdx/dFdy method at some point though. I’ve never really been able to get bump/normal mapping to work properly, so this is probably the way to go in future.