I have a scene that normally renders at about 70 fps, but when I mask a color channel with glColorMask, the frame rate drops to only 2 fps! (The color is masked as expected, though.)
Does anyone have an idea what is going on?
My code looks like this:
Thanks for your advice
Maybe you are getting software rendering.
You mean this function is not implemented in hardware? I would have guessed it is too simple not to be…
Can you recommend an alternative solution?
Yes, I’m saying that it may not be implemented in hardware.
When I had a TNT, I tried masking out a specific color channel and the rendering time was quite high. I haven’t tried it on my GF2, though, so I can’t say how it performs there.
You can simulate the effect with texture combiners. This requires a free texture unit. Set everything up as usual, then enable one more texture unit. Using GL_ARB_texture_env_combine, set the combine function to GL_MODULATE, arg0 to GL_PREVIOUS, and arg1 to GL_CONSTANT. The constant color will then be your mask.
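A sketch of the combiner setup described above, assuming texture unit 1 is free and GL_ARB_texture_env_combine is supported. This needs an active GL context, so it is a fragment rather than a standalone program, and the exact unit and mask color are just illustration:

```c
/* Sketch: mask out the blue channel on a spare texture unit using
   GL_ARB_texture_env_combine. Assumes an active GL context and that
   the extension is available. You may also need a dummy (e.g. 1x1
   white) texture bound on the unit so the unit counts as enabled. */
const GLfloat mask[4] = { 1.0f, 1.0f, 0.0f, 1.0f };  /* zero out blue */

glActiveTextureARB(GL_TEXTURE1_ARB);
glEnable(GL_TEXTURE_2D);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB,  GL_MODULATE);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB,  GL_PREVIOUS_ARB);  /* arg0 */
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB,  GL_CONSTANT_ARB);  /* arg1 */
glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, mask);           /* the mask */
glActiveTextureARB(GL_TEXTURE0_ARB);
```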
I’m not sure, but I think you set the constant color with glTexEnvfv and GL_TEXTURE_ENV_COLOR (glBlendColor sets the blend constant, which is a different thing). A constant color of (1.0, 1.0, 0.0, 1.0) will then remove the blue channel, since blue gets multiplied by zero.
Getting 2 fps sounds like software rendering.
Use glGetString(GL_VENDOR) to see if you are.
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE) is the default state, btw.
Using glGetString to detect software rendering dynamically is not reliable: it only identifies the driver/hardware and says nothing directly about whether rendering is done in software or hardware. The one thing you can check is whether you are running on MS’s generic implementation, which is always software. Anything else can be either software or hardware. Since he gets 70 fps without changes to the color mask, I’m pretty sure he’s not using MS’s implementation.
Thanks for your help guys.
(BTW, my graphics card is an Intel 82815; nothing special, I guess.)