glColor4d ignores alpha

Hi there. I’m stumped: glColor4d seems to completely ignore the alpha component and always behaves as if it were 1.


		glBegin(GL_QUADS);
		glColor4d(r, g, b, a); // whatever I pass for a (0.0 to 1.0) is ignored; r, g, b work as expected
		glTexCoord2f(0, 0); glVertex2d(-1, -1);
		glTexCoord2f(1, 0); glVertex2d(+1, -1);
		glTexCoord2f(1, 1); glVertex2d(+1, +1);
		glTexCoord2f(0, 1); glVertex2d(-1, +1);
		glEnd();
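For anyone landing here with the same symptom: per-vertex alpha has no visible effect unless blending is enabled, and it is off by default. A minimal sketch of the missing state setup, assuming a current GL context is already set up (this is standard fixed-function GL, not taken from the code above; the blend factors shown are just the usual transparency choice):

```c
/* Alpha is carried through the pipeline either way, but with blending
 * disabled the fragment simply overwrites the framebuffer, so the alpha
 * value looks "ignored". Enable blending once during GL setup: */
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
```

This is a state fragment, not a complete program; it belongs wherever the rest of the GL state (viewport etc.) is configured.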

Help? As you can see, I’m not even using the texture coordinates yet… which shouldn’t matter, right? Setting a different color for each corner works fine; it’s only the alpha that gets ignored.

I googled a LOT and tried many things… alas, nothing. The best I can come up with is “this should work, it just doesn’t”. I’m using SDL, by the way, but so far it does no more than init, set up the screen/window, and open the OpenGL viewport.

My GFX card is not a gaming beast, but it’s not that old either, and I never notice issues in any app besides the one I’m writing, so I assume it’s not a driver problem.

Hmm, I got it working now by stripping everything down to the bare bones… though I don’t know what the mistake was. Sorry for the useless topic :o