Hi, I have been looking into normal mapping, and with triangle geometry it seems that the tangent and binormal are computed from the mesh and then passed into the hardware.

I was wondering if it is possible to compute the tangent and binormal on the fly in some way.

For example, in volume rendering, I would like to address a normal map, but I only have an object-space position and normal vector.

My math is not great; is this at all possible, perhaps with partial derivatives?

From looking at the code, it seems that the three vertices of a triangle (together with their texture coordinates) are used to arrive at the tangent vector.
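For reference, the standard per-triangle computation solves the triangle's edge vectors against its UV deltas; here is a minimal sketch of that idea (function and variable names are mine, not from any particular engine):

```python
import numpy as np

def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Tangent of one triangle from its positions and texture coords.
    Solves  e1 = du1*T + dv1*B,  e2 = du2*T + dv2*B  for T."""
    e1, e2 = p1 - p0, p2 - p0
    du1, dv1 = uv1 - uv0
    du2, dv2 = uv2 - uv0
    r = 1.0 / (du1 * dv2 - du2 * dv1)  # assumes non-degenerate UVs
    t = r * (dv2 * e1 - dv1 * e2)
    return t / np.linalg.norm(t)
```

Note that the texture coordinates are essential here: without UVs there is no preferred direction on the surface, which is exactly the problem with a raw scalar field.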

I don't have a triangle mesh of any sort, just a scalar field. Would it still be possible to get a tangent vector from simply one normal and position in some manner?
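From a single normal alone there is no unique tangent, but you can always construct *some* orthonormal frame around it, e.g. by crossing the normal with whichever world axis is least aligned with it. This is a common trick, sketched here under the assumption that a consistent-but-arbitrary frame is acceptable (names are mine):

```python
import numpy as np

def tangent_frame(n):
    """Build an arbitrary orthonormal (tangent, binormal) pair around
    a unit normal n. Consistent, but not tied to any texture coords."""
    # Pick the world axis least aligned with n to avoid a degenerate cross
    a = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    t = np.cross(a, n)
    t /= np.linalg.norm(t)
    b = np.cross(n, t)
    return t, b
```

The catch is that such a frame can rotate unpredictably as the normal varies over the surface, so the normal map's detail may "swim" in direction; it only works well if the map's features are rotation-insensitive.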

in volume rendering, i would like to address a normal map
In volume rendering, a normal map is a 3-D texture that defines a normal vector at each voxel. So you don't need any tangents, or even per-vertex normals.
Tangents are only needed if you want to apply a 2-D normal map to a surface in such a way that the texture coordinates change direction across it.
So if you need tangents for volume rendering, I assume you want to apply a 2-dimensional normal map to a 3-dimensional volume. Is that right?

Yes, this is correct. I already have techniques such as iso-surfacing with lighting working for volumes, with the normal worked out via central differences.
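For context, the central-differences normal amounts to sampling the scalar field on both sides of the voxel along each axis and negating the normalized gradient; a minimal sketch on a NumPy volume (function name is mine):

```python
import numpy as np

def central_diff_normal(field, x, y, z):
    """Normal at interior voxel (x, y, z): the negated, normalized
    gradient of the scalar field, estimated with central differences."""
    g = np.array([
        field[x + 1, y, z] - field[x - 1, y, z],
        field[x, y + 1, z] - field[x, y - 1, z],
        field[x, y, z + 1] - field[x, y, z - 1],
    ]) * 0.5
    norm = np.linalg.norm(g)
    return -g / norm if norm > 0 else g
```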

Ideally I would like to port existing techniques such as normal mapping into the volume pipeline.

My current technique uses a proxy object representation to derive the tangent space, but it would be better to derive the tangent direction on the fly. I'm just not sure if this is possible with the information at hand.
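One known on-the-fly approach (not from this thread, so treat it as a suggestion) is to build the tangent frame in the pixel shader from screen-space partial derivatives of the position and the texture coordinates, as in Christian Schüler's "normal mapping without precomputed tangents" technique. It still requires some 2-D UV parameterization at the shading point, but no precomputed per-vertex tangents. The core math, sketched in Python with the derivatives passed in explicitly (in a real shader they would come from ddx/ddy-style instructions):

```python
import numpy as np

def tangent_from_derivatives(n, dp_dx, dp_dy, duv_dx, duv_dy):
    """Tangent and binormal at a shading point from the screen-space
    derivatives of position (dp_*) and texture coords (duv_*).
    n is the unit surface normal at the point."""
    # Edge vectors projected perpendicular to the normal
    dp2perp = np.cross(dp_dy, n)
    dp1perp = np.cross(n, dp_dx)
    # Combine by the UV derivatives to get the u- and v-aligned directions
    t = dp2perp * duv_dx[0] + dp1perp * duv_dy[0]
    b = dp2perp * duv_dx[1] + dp1perp * duv_dy[1]
    return t / np.linalg.norm(t), b / np.linalg.norm(b)
```

For a volume you would still have to decide where the UVs come from (e.g. from your proxy object, or a planar/triplanar projection), but this moves the tangent computation entirely into the shader.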