Multitexturing on 3dfx cards

I want to use multitexturing, where one texture contains the alpha values and the other contains the texture itself, so I set up the texture units like this:

(*(glextensions->glActiveTextureARB))(GL_TEXTURE0_ARB);
glEnable(GL_TEXTURE_2D);
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
glBindTexture(GL_TEXTURE_2D, id1);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_BLEND);

(*(glextensions->glActiveTextureARB))(GL_TEXTURE1_ARB);
glEnable(GL_TEXTURE_2D);
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
glBindTexture(GL_TEXTURE_2D, id2);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glColor4f(1.0f, 1.0f, 1.0f, 1.0f);

This works fine on my GeForce card, but on my friend's Voodoo3 it just produces totally black polygons. Am I doing something wrong that just happens to work on my card, or does the Voodoo3 have serious problems with operations like this?

I should perhaps point out that id1 is an ALPHA texture loaded with
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, dim, dim, 0, GL_ALPHA, GL_UNSIGNED_BYTE, buf);
and id2 is an RGB texture loaded with
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, dim, dim, 0, GL_RGB, GL_UNSIGNED_BYTE, buf);

I might be missing the gist of what you’re trying to do, but I’m not understanding why you want to use multitexturing to do alpha blending. You could either use a Targa with an alpha channel, or you could build the alpha values into your texture bitmap at run-time. Both of these approaches are covered at NeHe’s site: nehe.gamedev.net under the tutorials section.
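For the run-time approach, a minimal sketch of what that might look like: interleave an RGB image and a separate alpha map into one RGBA buffer, then upload it with GL_RGBA instead of using two texture units. The function name and buffer layout here are illustrative, not from the original post:

```c
#include <stdlib.h>

/* Interleave an RGB image (3 bytes per texel) and a separate alpha map
   (1 byte per texel) into a single RGBA buffer suitable for
   glTexImage2D(..., GL_RGBA, ..., GL_RGBA, GL_UNSIGNED_BYTE, ...).
   The caller is responsible for freeing the returned buffer. */
unsigned char *build_rgba(const unsigned char *rgb,
                          const unsigned char *alpha,
                          int dim)
{
    unsigned char *rgba = malloc((size_t)dim * dim * 4);
    if (!rgba)
        return NULL;
    for (int i = 0; i < dim * dim; ++i) {
        rgba[i * 4 + 0] = rgb[i * 3 + 0];  /* R */
        rgba[i * 4 + 1] = rgb[i * 3 + 1];  /* G */
        rgba[i * 4 + 2] = rgb[i * 3 + 2];  /* B */
        rgba[i * 4 + 3] = alpha[i];        /* A from the alpha map */
    }
    return rgba;
}
```

The resulting buffer would then be uploaded once with glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, dim, dim, 0, GL_RGBA, GL_UNSIGNED_BYTE, rgba), avoiding the second texture unit entirely.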

Voodoo3 has some “problems” with texture env. mode GL_BLEND.

What I’m trying to do is have an envmap and use the alpha to control the transparency of that envmap. That would mean reconstructing the texture every frame; not only that, but for a large number of envmapped polygons I might need a HUGE amount of texture memory to hold all the combinations of envmap + alphamap for just that frame.
Anyway, I found out that the problem was that I had constructed my settings from the OpenGL 1.2 specs, while 3dfx only seems to support 1.1. But even when using the GL_EXT_texture extension (which should define all the necessary glTexEnv settings), the problem occurred (still only on 3dfx cards). So I solved it by converting all alpha textures to RGBA textures with RGB = 255,255,255 (I suppose I’ll have to add a flag there, so not everyone has to waste the extra texture memory just because of the problem with 3dfx cards) and using GL_MODULATE for both texture units.
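For anyone hitting the same thing, a sketch of that workaround, expanding an alpha-only image to RGBA with the color channels forced to white, so GL_MODULATE leaves the incoming color untouched while the alpha still comes through (the function name is made up for illustration):

```c
#include <stdlib.h>

/* Expand an alpha-only image (1 byte per texel) into an RGBA buffer
   with RGB fixed at 255,255,255. Modulating by white is a no-op on
   the color channels, so GL_MODULATE behaves like the alpha-only
   texture did, at the cost of 4x the texture memory.
   The caller frees the returned buffer. */
unsigned char *alpha_to_white_rgba(const unsigned char *alpha, int dim)
{
    unsigned char *rgba = malloc((size_t)dim * dim * 4);
    if (!rgba)
        return NULL;
    for (int i = 0; i < dim * dim; ++i) {
        rgba[i * 4 + 0] = 255;       /* R forced to white */
        rgba[i * 4 + 1] = 255;       /* G forced to white */
        rgba[i * 4 + 2] = 255;       /* B forced to white */
        rgba[i * 4 + 3] = alpha[i];  /* original alpha value */
    }
    return rgba;
}
```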