# displaying field values on a surface

I display 3D surfaces (a CAD file) in my viewer.

Now I want to add a 4th piece of information to XYZ in my view by adding a color on the surface corresponding to this 4th field.

The 4th field could be the temperature of the part, collected by sensors. The color will go from blue for cool to red for warm, and I want to fill the surface by interpolating temperature values for areas without sensors.

In the end I would like to get something similar to http://brain.wireos.com/wp-content/uploads/x30_aero10.PNG

Thank you

Simple: 1D texturing.
Build a 1D gradient texture that goes smoothly from blue for cool to red for warm.
Then call glTexCoord1f(temp) for each vertex (with temp normalized between 0.0 for coolest and 1.0 for warmest).
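As a sketch of what building that gradient might look like (the function name is mine; the result would then be uploaded with the standard glTexImage1D call):

```cpp
#include <cstdint>
#include <vector>

// Build an n-texel RGB gradient from blue (cool, t=0) to red (warm, t=1).
// Upload afterwards with e.g.:
//   glTexImage1D(GL_TEXTURE_1D, 0, GL_RGB, n, 0, GL_RGB, GL_UNSIGNED_BYTE, texels.data());
std::vector<std::uint8_t> buildGradient(int n) {
    std::vector<std::uint8_t> texels(n * 3);
    for (int i = 0; i < n; ++i) {
        float t = (n > 1) ? float(i) / float(n - 1) : 0.0f; // 0 = coolest, 1 = warmest
        texels[i * 3 + 0] = std::uint8_t(t * 255.0f + 0.5f);          // red rises with t
        texels[i * 3 + 1] = 0;                                        // no green
        texels[i * 3 + 2] = std::uint8_t((1.0f - t) * 255.0f + 0.5f); // blue falls with t
    }
    return texels;
}
```

A smoother ramp (e.g. going through green in the middle) works the same way; only the per-texel color function changes.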

Yup, and on a mesh, if you don't have a sensor per vertex, you interpolate what the sensor value is at each vertex. This means doing per-vertex interpolation of sensor readings (the texture coordinates).

The efficient way to do this is an unusual use of skinning.

You're actually skinning for temperature (the texture coordinate), where the mesh geometry and transform are fixed but each bone represents a sensor. The bone weights, summing to 1, represent the sensor temperature interpolation across the mesh.
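In code, the "bone weights" idea is just a per-vertex weighted average of the sensor readings (a minimal sketch; blendTemp is a name I made up):

```cpp
#include <cstddef>
#include <vector>

// Skinning analogy: each sensor is a "bone". A vertex stores one weight per
// sensor (the weights summing to 1), and the blended reading becomes the
// vertex's 1D texture coordinate.
float blendTemp(const std::vector<float>& weights, const std::vector<float>& temps) {
    float t = 0.0f;
    for (std::size_t i = 0; i < weights.size(); ++i)
        t += weights[i] * temps[i];
    return t; // stays in [0,1] when temps are normalized and weights sum to 1
}
```

Per frame you would only update the temps array; the weights depend on geometry alone and can be precomputed.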

It’s a beautiful thing.

You'd want to use shaders to do this, I think, and it's a bit advanced, but once set up you could simply supply per-sensor (per-bone) temperature data, and the vertex shader would interpolate the temperature over the mesh and then use the texture coordinate to fetch from ZbuffeR's 1D gradient.

Thank you ZbuffeR

but my problem is that the sensor positions do not coincide with the triangle vertices, and they may move from one time to another

Bad news: all the solutions that came to my mind are not easy to implement.
Solution 1:
Render the color to a texture.
1- Find the triangle under each sensor
2- Compute the UV of the point under the sensor
3- Render to texture on a grid with the (1D texture | vertex color) technique explained before
4- Apply the texture to the mesh
It can be quite slow for a large number of sensors, and the mesh needs non-overlapping UVs.
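Steps 1 and 2 of Solution 1 come down to barycentric coordinates: they tell you both whether a point lies inside a triangle and how to blend that triangle's UVs at the point. A sketch, assuming sensors and triangles have been projected into a common 2D space (the names are mine):

```cpp
struct Vec2 { float x, y; };

// Barycentric weights (a, b, c) of point p w.r.t. triangle (p0, p1, p2).
// p is inside the triangle when all three weights are in [0, 1].
// The sensor's UV is then a*uv0 + b*uv1 + c*uv2.
void barycentric(Vec2 p, Vec2 p0, Vec2 p1, Vec2 p2, float& a, float& b, float& c) {
    float d = (p1.y - p2.y) * (p0.x - p2.x) + (p2.x - p1.x) * (p0.y - p2.y);
    a = ((p1.y - p2.y) * (p.x - p2.x) + (p2.x - p1.x) * (p.y - p2.y)) / d;
    b = ((p2.y - p0.y) * (p.x - p2.x) + (p0.x - p2.x) * (p.y - p2.y)) / d;
    c = 1.0f - a - b;
}
```

For a degenerate (zero-area) triangle d is 0, so a real implementation would reject those first.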
Solution 2:
Tessellate (on the CPU) the mesh to match the positions of the sensors.
For each triangle:
1- Find how many sensors are over the triangle
2- Add vertices to the mesh based on the number of sensors
3- Apply ZbuffeR's technique
Very slow, but you can do it only when someone moves a sensor, or at load time if nobody moves a sensor during the simulation. It can become quite tricky on a curved mesh.
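For the simplest case of step 2, one sensor strictly inside a triangle, the tessellation is a fan split at the sensor point (a toy sketch; on a curved mesh you would also have to project the new vertex back onto the surface):

```cpp
#include <vector>

struct P { float x, y; };
struct Tri { P a, b, c; };

// Split triangle (a, b, c) at interior point s into three triangles sharing s.
// The original vertex order is kept, so the winding (and thus facing) is preserved.
std::vector<Tri> splitAt(P a, P b, P c, P s) {
    return { {a, b, s}, {b, c, s}, {c, a, s} };
}
```

With several sensors in one triangle you would split recursively or run a proper constrained triangulation instead.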
Solution 3: export the mesh with a lot of vertices and use dorbie's technique.
Very easy but uses more memory.

Are these sensors physically located on the object? If so, you could remodel the surface using the sensors and the CAD vertices to define a mesh. Coloring the surface could then be done with simple color interpolation. Lighting and textures would not be necessary.

If the sensor is located off the object and is taking readings from a point in space you might be able to project a texture onto the CAD surface.

Cochise3D: then take dorbie's advice.
Without shaders, you will have to interpolate the sensor data yourself so that each vertex has data.

EDIT: sorry, a bit late

Well, my advice now falls short, since the new information suggests that there are insufficient vertices.

What's needed in this case is a dependent read, with one texture being an index mapping sensor locations to their values, and then to the 1D ramp.

Unfortunately, to get temperature interpolation you need post-LUT filtering: you have to perform nearest sampling in the texture fetch and weight the dependent fetches accordingly.

Yikes. It gets really ugly, because you're trying to map a spatial component to a sensor index, then map to the resulting temperature for that index, then colorize.

Frankly, it'd be easier to just add the required vertices. Or prebake the results to a texture image skinned to the surface, especially if you don't need dynamic sensor data.

My advice is to skin the wing with a 2D texture image. Plot the sensor locations and their temperatures in texture image space, and in the image interpolate these points using some kind of Delaunay/Voronoi/diffusion algorithm. This is simpler than it sounds, really, but if the data changes over time you will need to animate this whole process.
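A crude stand-in for that interpolation step is inverse-distance weighting (Shepard's method) over the texture image. This sketch assumes the sensor positions are already expressed in normalized UV space and that there is at least one sensor; the names are mine:

```cpp
#include <vector>

struct Sensor { float u, v, temp; }; // position in [0,1] texture space, normalized temp

// Fill a w x h scalar image by inverse-distance weighting of the sensor
// readings (a simple substitute for a Delaunay or diffusion interpolation).
// The result can be baked to a texture and fed through the 1D color ramp.
std::vector<float> bakeTempImage(int w, int h, const std::vector<Sensor>& sensors) {
    std::vector<float> img(w * h, 0.0f);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            float u = (x + 0.5f) / w, v = (y + 0.5f) / h; // texel center in UV space
            float num = 0.0f, den = 0.0f;
            for (const Sensor& s : sensors) {
                float d2 = (u - s.u) * (u - s.u) + (v - s.v) * (v - s.v);
                if (d2 < 1e-12f) { num = s.temp; den = 1.0f; break; } // texel on a sensor
                float wgt = 1.0f / d2; // closer sensors dominate
                num += wgt * s.temp;
                den += wgt;
            }
            img[y * w + x] = num / den;
        }
    }
    return img;
}
```

If the sensors move or their readings change over time, you simply rebake; a diffusion or Delaunay-based fill would give smoother isolines but the pipeline stays the same.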

Keep it simple and you can try to get fancy later when you have the basic mapping working.

You could keep the interpolated image as high-precision monochrome, perhaps a HALF format, then do a dependent read and still use ZbuffeR's ramp, allowing you to edit the color ramp, adjust the range, look at isothermal lines, etc.