I have two arrays of ‘unsigned char’ data, and I’d like to do bitwise AND operations between them.
If I have arrays A and B, what I need to do is something like this:
C[i] = A[i] & B[i];
Because the arrays are very large and the platform I’m working on is a smartphone, I am trying to do the job in GLSL ES.
Here, I encountered two problems.
The texture values retrieved from texture2D(.) are given as floats, not as the ‘unsigned char’ values I want to use. Is there a way to use the unsigned char data as-is in a GLSL ES shader?
There is no bitwise AND operation in GLSL ES, as far as I’m aware. How can I do an AND? Should I implement a function that mimics AND manually?
Unfortunately, there is no support for integer textures or integer operations in GLSL ES.
While you could scale the [0,1] values returned by a texture fetch to the 0…255 range, implementing an operation that mimics the AND operator without any native integer support would be overkill, and a bad idea from a performance point of view.
So I would keep the calculation on the CPU side in your case.
I think the reason it was dropped from ES2 is that no embedded GPU supports logic operations; they don’t support integer manipulation either in their shader cores or in their blending hardware.
The only reason I can think of why it was in ES1 is that, at the time, most embedded devices did everything in software anyway, so keeping such a trivial function in ES was okay.
To be honest, I don’t know how well logic ops were supported on earlier desktop hardware, but my guess is that they have been fully hardware accelerated only since OpenGL 3 capable hardware, which came with integer support.
Maybe you should try arekkusu’s idea of using glLogicOp(GL_AND) with ES1 and share your experience with us. I’d be interested to know whether it is really hardware accelerated or just emulated on the CPU.