GL_ARB_texture_env_combine problems

Hi, I’m trying to use GL_ARB_texture_env_combine for the first time and I’ve run into some problems. As I understand it, I should be able to set up a texture environment using this extension for each available texture unit. On my GeForce4 I believe this means I should be able to set up exactly four texture environments.

However, if I set up more than two, I get results I don’t expect from the last two texture environment stages. The textures bound to those stages appear to be sampled as a constant colour, which they are not. glGetError() is not reporting any problems, and I have the latest public drivers (43.65). Unless I’ve misunderstood how many texture environments are possible on the GeForce4, I believe I am setting everything up correctly.

Has anyone else experienced similar problems?


Originally posted by ssmax:
On my GeForce4 I believe this means I should be able to setup exactly four texture environments.

This is true for the GeForce4 Ti series. If you’re using either a GeForce4 MX or a GeForce4 Go, you will be limited to two texture units.
You can check the number of available texture units by querying glGetIntegerv with GL_MAX_TEXTURE_UNITS_ARB. HTH
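A minimal sketch of that query, assuming a current GL context and that GL_ARB_multitexture is supported (the printf is just for illustration):

```c
#include <stdio.h>
#include <GL/gl.h>
#include <GL/glext.h>

GLint maxUnits = 0;
glGetIntegerv(GL_MAX_TEXTURE_UNITS_ARB, &maxUnits);
printf("fixed-function texture units: %d\n", maxUnits);
/* Expect 4 on a GeForce4 Ti, 2 on a GeForce4 MX/Go */
```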

Yeah, it’s a GeForce4 Ti 4600 and it says it supports 4 texture units, so I’m slightly baffled.

I’ve even tried setting a GL_REPLACE texture environment in the last stage, with the source coming from GL_TEXTURE and the operand set to GL_SRC_COLOR, and the texture still doesn’t come out correctly.
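For reference, a sketch of that last-stage setup using the ARB_texture_env_combine tokens (the choice of unit 3 as the last stage is an assumption, not from the original post):

```c
#include <GL/gl.h>
#include <GL/glext.h>

/* Select the last texture unit and configure a pass-through combine stage:
   RGB = REPLACE(texture colour), i.e. the stage should just output the
   bound texture's colour unchanged. */
glActiveTextureARB(GL_TEXTURE3_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB,  GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB,  GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR);
```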

Mmm… I’ve just had a sudden thought that it may be the texture co-ordinates. I’m using vertex arrays, and by default the texture co-ordinates for each stage come from the first set, don’t they?


As long as you don’t give GL a texcoord for each unit, I would expect them to be anything.
However, if you still have problems, it might be useful if you post the code with which you set up the environment.


Interesting suggestion, but that shouldn’t make any difference. Anyhow, found the problem: being idiotic, I didn’t realise that when using vertex arrays I had to explicitly enable the texture coordinate client state for each texture unit. Oops. Works now though, thanks.
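For anyone hitting the same thing, a sketch of the fix: client-side texcoord array state is per unit and selected with glClientActiveTextureARB, so each unit needs its own enable and pointer (the array names here are placeholders):

```c
#include <GL/gl.h>
#include <GL/glext.h>

/* texCoords[unit] is a hypothetical array of 2D texcoords per unit */
extern const GLfloat *texCoords[4];

for (int unit = 0; unit < 4; ++unit) {
    /* Select which unit the following client-state calls affect */
    glClientActiveTextureARB(GL_TEXTURE0_ARB + unit);
    /* Without this enable, the unit falls back to the current
       (constant) texcoord instead of reading the array */
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glTexCoordPointer(2, GL_FLOAT, 0, texCoords[unit]);
}
```

Note that glActiveTextureARB selects server-side state (texture bindings, env mode) while glClientActiveTextureARB selects the client-side array state, which is easy to mix up.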