Well, my earlier advice falls short now that the new information suggests there aren't enough vertices.
What’s needed in this case is a dependent read, with one texture being an index that maps sensor locations to their values and then to the 1D ramp.
Unfortunately, to get temperature interpolation you need post-LUT filtering: you have to perform nearest sampling in the texture fetch and weight the dependent fetches accordingly.
Yikes. It gets really ugly because you’re trying to map a spatial location to a sensor index, then map that index to a temperature, then colorize the result.
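To make that concrete, here is a minimal CPU-side sketch of the post-LUT filtering idea: fetch the sensor index with nearest (unfiltered) sampling at the four surrounding texels, do the dependent lookup for each, and only then blend with the usual bilinear weights. The texture layout and names are made up for illustration.

```python
import math

def sample_filtered(index_tex, temp_lut, u, v):
    """index_tex: 2D grid of sensor indices; temp_lut: 1D temperature table.
    (u, v) are texel-space coordinates."""
    x0, y0 = int(math.floor(u)), int(math.floor(v))
    fx, fy = u - x0, v - y0
    h, w = len(index_tex), len(index_tex[0])

    def fetch(x, y):
        # Nearest (unfiltered) fetch of the index, then the dependent LUT read.
        x = min(max(x, 0), w - 1)
        y = min(max(y, 0), h - 1)
        return temp_lut[index_tex[y][x]]

    # Filter AFTER the lookup: blend the looked-up temperatures, never
    # the raw indices (filtered indices would be meaningless).
    t00, t10 = fetch(x0, y0), fetch(x0 + 1, y0)
    t01, t11 = fetch(x0, y0 + 1), fetch(x0 + 1, y0 + 1)
    top = t00 * (1 - fx) + t10 * fx
    bot = t01 * (1 - fx) + t11 * fx
    return top * (1 - fy) + bot * fy

index_tex = [[0, 1], [2, 3]]          # sensor index per texel
temp_lut = [10.0, 20.0, 30.0, 40.0]   # temperature per sensor
print(sample_filtered(index_tex, temp_lut, 0.5, 0.5))  # → 25.0
```

In a shader this means several nearest-filtered fetches plus the weighting math per fragment, which is exactly why it gets ugly.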
Frankly, it’d be easier to just add the required vertices, or prebake the results into a texture image skinned to the surface, especially if you don’t need dynamic sensor data.
My advice is to skin the wing with a 2D texture image. Plot the sensor locations and their temperatures in texture-image space, and in the image interpolate these points using some kind of Delaunay/Voronoi/diffusion algorithm. This is simpler than it sounds, really, but if the data changes over time you will need to animate this whole process.
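As one concrete stand-in for those interpolation options, here is a tiny inverse-distance-weighted bake over a texture-space grid (a simpler cousin of the diffusion approach; the sensor positions and values are made up):

```python
def bake_temperature_image(sensors, width, height, power=2.0):
    """Interpolate scattered (x, y, temperature) samples across a grid
    with inverse-distance weighting."""
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            num = den = 0.0
            exact = None
            for sx, sy, t in sensors:
                d2 = (x - sx) ** 2 + (y - sy) ** 2
                if d2 == 0.0:
                    exact = t          # this pixel sits exactly on a sensor
                    break
                w = 1.0 / d2 ** (power / 2.0)
                num += w * t
                den += w
            row.append(exact if exact is not None else num / den)
        image.append(row)
    return image

sensors = [(0, 0, 10.0), (3, 3, 40.0)]
img = bake_temperature_image(sensors, 4, 4)
print(img[0][0], img[3][3])  # → 10.0 40.0 (exact at the sensor pixels)
```

For animated data you would just re-run the bake (or an equivalent render-to-texture pass) each time the sensor values update.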
Keep it simple and you can try to get fancy later when you have the basic mapping working.
You could keep the interpolated image as high-precision monochrome, perhaps in a HALF format, then do a dependent read and still use zbuffer’s ramp, allowing you to edit the color ramp, adjust the range, look at isothermal lines, etc.
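The payoff of keeping the image monochrome is that colorization stays a cheap final lookup. A rough sketch of that stage, with a hypothetical three-entry ramp and an optional band count to fake isothermal contours:

```python
def colorize(temp_image, ramp, lo, hi, bands=None):
    """Map a monochrome temperature image through a 1D color ramp.
    lo/hi set the displayed range; bands quantizes into isotherm steps."""
    out = []
    for row in temp_image:
        out_row = []
        for t in row:
            s = (t - lo) / (hi - lo)           # normalize into [0, 1]
            s = min(max(s, 0.0), 1.0)
            if bands:                          # quantize -> isothermal bands
                s = int(s * bands) / bands
            out_row.append(ramp[int(s * (len(ramp) - 1))])
        out.append(out_row)
    return out

ramp = [(0, 0, 255), (0, 255, 0), (255, 0, 0)]   # blue -> green -> red
image = [[10.0, 25.0, 40.0]]
print(colorize(image, ramp, 10.0, 40.0))
# → [[(0, 0, 255), (0, 255, 0), (255, 0, 0)]]
```

Because the ramp and the lo/hi range live outside the baked image, you can edit them freely without touching the interpolation pass.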