GLSL debugging problem

Hello everyone,

I have a question about GLSL debugging. I know I need to output a certain value as a color, but how do you do the conversion between a value and a color?

Normally I do this: gl_FragColor = vec4(a/k, 0.0, 0.0, 1.0);

where “a” is the value I want to check,
and “k” is a value known to be twice the correct value of “a”.

Then I take a screenshot of the rendered result and check whether the color value is (127, 0, 0).

But this is not very accurate: since I'm only using one of the three color channels, the output can only take one of 256 levels (0–255).

In a normal C++ program, I do this conversion to make use of all the channels of a color vector:

#include <cstring>  // for std::memcpy

struct COLORID
{
    unsigned char r;
    unsigned char g;
    unsigned char b;
    unsigned char a;
};

// Reinterpret the four bytes of a 32-bit int as four 8-bit color channels.
COLORID intToColor(int _in)
{
    COLORID result;

    // Copy the bytes of the int straight into the struct; memcpy avoids the
    // undefined behaviour of casting &_in to a COLORID pointer.
    std::memcpy(&result, &_in, sizeof(result));

    return result;
}

But in a shader program I'm not allowed to use pointers, so how can I do this conversion?

Thanks.

Unless the “_in” variable is a 32-bit value that packs four 8-bit values, what you're doing makes no sense.

If that is what you're doing, then as long as you're on GL 3.0+ class hardware you can use bitwise arithmetic to extract each of the byte values. Then do the appropriate division to convert them into floats (since that's likely what you're outputting).
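
For what it's worth, here is a minimal sketch of that idea, assuming GLSL 1.30+ (GL 3.0 class hardware) and that the value you want to inspect arrives in the fragment shader as a 32-bit int; the names “debugValue”, “intToColor” and “fragColor” are placeholders, not anything from your code:

#version 130

// Hypothetical per-fragment value to inspect, passed from the vertex shader.
// Integer fragment inputs must be flat-qualified.
flat in int debugValue;

out vec4 fragColor;

// Unpack the four bytes of a 32-bit int into the four color channels,
// each divided by 255.0 so it lands in the 0.0 - 1.0 range the framebuffer expects.
vec4 intToColor(int v)
{
    float r = float( v        & 0xFF) / 255.0;
    float g = float((v >>  8) & 0xFF) / 255.0;
    float b = float((v >> 16) & 0xFF) / 255.0;
    float a = float((v >> 24) & 0xFF) / 255.0;
    return vec4(r, g, b, a);
}

void main()
{
    fragColor = intToColor(debugValue);
}

When you read the screenshot back, reassembling the channel bytes as r + (g << 8) + (b << 16) + (a << 24) should recover the original int, provided blending, dithering and sRGB conversion are disabled for that framebuffer.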
