Color-mapping shader: card-specific bug

I have a fragment shader (below) that fails on some cards.

It works well on a Quadro 3800 and a Quadro 4600 but fails on a Quadro 5500 and the GTX 400. When it fails, it looks up the wrong color, as if it were sampling the wrong location, yet it still runs without errors. I suspect I am doing something wrong that just happens to work on certain cards.

Below, the function getSuperpixelColor() does two lookups to get the color:

  1. Lookup in indexTexture at the current uv
  2. Take that RGBA value, treat it as a 24-bit value, and look it up in the colorMap texture. That texture is 256 x colormapRows.

The unpack stuff with the dot product is adapted from code I found online. I'm open to doing this a totally different way, if my way is just not smart. Basically we have one indexTexture of RGBA values which I want to interpret as indices into another colorMap texture, and look them up.
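To sanity-check the dot-product unpack on the CPU, here is the same math in C (a sketch, not the shader itself): the sampler returns each channel normalized to [0, 1], so multiplying by 255 recovers the byte, and the weights (255, 255*256, 255*65536) rebuild the 24-bit index.

```c
#include <math.h>

/* CPU mirror of the shader's unpackVec3: takes the normalized [0,1]
   channel values a sampler would return and rebuilds the 24-bit index. */
float unpack_vec3(float r, float g, float b)
{
    return r * 255.0f + g * (255.0f * 256.0f) + b * (255.0f * 65536.0f);
}

/* Forward direction for reference: byte triple -> 24-bit index. */
long index_from_bytes(int r, int g, int b)
{
    return (long)r + (long)g * 256 + (long)b * 65536;
}
```

The largest index, 16777215, is still below 2^24, so a 32-bit float can represent every index exactly; the only slack needed is for the rounding of the normalized channel values.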

We also have a grayscale texture and blend this color value with the grayscale, but that is not shown.

Originally I had 16-bit values and used GL_LUMINANCE16. When we needed 24 bits I switched to RGBA. I'm not sure whether some other texture format would make things easier or faster?

Thanks!


// Unpack a vec3 into a single float index.
// We expect 24-bit RGB values only; alpha is ignored.
float unpackVec3(vec3 v)
{
    // The sampler returns channels normalized to [0, 1], so each weight
    // scales a channel back to its byte value and shifts it into place.
    // Note: float literals, since strict GLSL versions reject implicit
    // int-to-float conversion in constant expressions.
    const vec3 unpackV = vec3(
        255.0,
        255.0 * 256.0,
        255.0 * (256.0 * 256.0));

    return dot(v.rgb, unpackV);
}

// Table is assumed to be 256 entries wide and colormapRows long.
vec2 getTableCoords(vec3 v)
{
    float f = unpackVec3(v);
    return vec2(mod(f, 256.0) / 255.0, floor(f / 256.0) / (colormapRows - 1.0));
}

vec4 getSuperpixelColor(vec2 uv)
{
    // Superpixel index, used in lookup table
    vec4 superpixelIndex = texture2D(indexTexture, uv);

    // Get coordinates where we can find the superpixel's color
    vec2 coord = getTableCoords(superpixelIndex.rgb);

    // Return the superpixel color
    return texture2D(colormapTexture, coord.st);
}

I simplified the shader, which improved things somewhat, but some error still remains. I also assumed a 256 x 256 colormap to simplify further for testing purposes.

I did a checkerboard test. My indexTexture has every possible 16-bit index, red = [0…255] and blue = [0…255]. My colormapTexture is a regular checkerboard.

So I expect the checkerboard to be drawn exactly. On the Quadro 5500 there are two glitches, at u/v ~0.95 and 1.0. On the Quadro 3800 it works with zero errors.

Any idea what could be causing the difference in cards?

I could expand my colormap to use 2x2 squares instead of single-pixel color values, but that seems wasteful.

Here is the result from the 5500:
http://i.imgur.com/0Lofj.png

Here is the now-trivial shader code:


vec2 getTableCoords(vec3 color)
{
    return vec2(color.r, color.g);
}

vec4 getSuperpixelColor(vec2 uv)
{
    // Superpixel index, used in lookup table
    vec4 superpixelIndex = texture2D(indexTexture, uv);

    // Get coordinates where we can find the superpixel's color
    vec2 coord = getTableCoords(superpixelIndex.rgb);

    // Return the superpixel color
    return texture2D(colormapTexture, coord.st);
}

We did not find a solution to this problem; we simply stopped using the Quadro 5500 cards. The Quadro 3800 and 4800, the GeForce 480, and even MacBooks work fine.

We also rewrote things to use a 3D texture and texelFetch where available, which simplified the shader while allowing any colormap size up to full 24-bit (256 x 256 x 256).
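The 3D-texture approach works because each byte of the 24-bit index maps directly onto one axis of a 256 x 256 x 256 volume, so texelFetch can address it with integer coordinates and no unpack arithmetic. A sketch of that mapping (my reconstruction, not the actual production code):

```c
/* Split a 24-bit index into the (x, y, z) texel coordinates of a
   256x256x256 colormap volume: the coordinates are simply the bytes
   of the index, so no normalization or dot-product unpack is needed. */
void index_to_texel(long index, int *x, int *y, int *z)
{
    *x = (int)(index & 0xFF);         /* red byte   */
    *y = (int)((index >> 8) & 0xFF);  /* green byte */
    *z = (int)((index >> 16) & 0xFF); /* blue byte  */
}
```

Because texelFetch takes exact integer texel coordinates and bypasses filtering, this also sidesteps the texel-edge rounding that normalized coordinates are subject to.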