gl1.4 && ARB_texture_env_crossbar && NV_texture_env_combine && Nvidia


While reading the OpenGL 1.4 spec I learned that the GL_ARB_texture_env_crossbar extension is now core.
As many know, NVIDIA doesn't support the ARB extension because of how it handles referencing a texture unit without a valid texture bound.
The ARB extension says that texturing for this unit should be disabled, but the NV extension says that a white pixel should be generated. How is that handled on NVIDIA cards?
In their spec for the NV30 extensions, the table at the beginning references an annotation for the crossbar extension, but the annotation isn't there.
Does anyone have an idea how this will be handled?


At first sight I'd check what extensions are supported by the drivers.
If there's only NV_texture_env_crossbar or only ARB_texture_env_crossbar then you've got the answer.
If both ARB_texture_env_crossbar and NV_texture_env_crossbar are supported then there's either an incompatibility or there are different tokens for each extension. That is, maybe in addition to GL_TEXTUREn_ARB there is GL_TEXTUREn_NV?
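For what it's worth, checking the extension string needs a little care: a plain strstr() would wrongly report a match when one extension name is a prefix of another (e.g. finding "GL_NV_texture_env_combine" inside "GL_NV_texture_env_combine4"). A minimal sketch of a token-exact check (the helper name `has_extension` is mine; in a real program the string would come from glGetString(GL_EXTENSIONS)):

```c
#include <string.h>

/* Return 1 if `name` appears as a complete, space-delimited token in the
 * `extensions` string; 0 otherwise. Guards against prefix matches, which
 * matters here because combine/combine4 names differ by one character. */
static int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;

    while ((p = strstr(p, name)) != NULL) {
        int starts_token = (p == extensions) || (p[-1] == ' ');
        int ends_token   = (p[len] == ' ') || (p[len] == '\0');
        if (starts_token && ends_token)
            return 1;
        p += len;  /* false hit inside a longer name; keep scanning */
    }
    return 0;
}
```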

BTW, are you speaking about NV_texture_env_crossbar or NV_texture_env_combine ?

Edit: I've searched a bit and have been unable to find NV_texture_env_combine or NV_texture_env_crossbar.

[This message has been edited by vincoof (edited 09-05-2002).]

The GL_NV_texture_env_combine4 extension provides functionality similar to GL_ARB_texture_env_crossbar. The problem, however, is that there's a small difference in the specifications, as described above, and there's no way for the driver to know which extension the application is trying to use.
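For reference, the overlapping functionality is letting one unit's combiner source another unit's texture. A hedged sketch of what that looks like with the ARB tokens (assumes a valid GL context with textures bound on units 0 and 1; state-setup fragment only, not runnable standalone):

```c
/* On unit 1, modulate unit 1's texture by unit 0's texture.
 * Naming GL_TEXTURE0_ARB as a combiner *source* on another unit is
 * exactly what the crossbar (and combine4) adds; plain
 * ARB_texture_env_combine only lets a unit reference its own texture. */
glActiveTextureARB(GL_TEXTURE1_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_MODULATE);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_TEXTURE0_ARB); /* other unit */
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_TEXTURE1_ARB); /* this unit  */

/* The disputed case: if unit 0 had no valid texture bound here, the ARB
 * crossbar spec said texturing on this unit is disabled, while
 * NV_texture_env_combine4 substitutes white -- the conflict discussed in
 * this thread. Portable code should simply never source an unbound unit. */
```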

[This message has been edited by Humus (edited 09-05-2002).]


Have a look at the .pdf on this page.

There's some information on ARB_texture_env_crossbar specific to NVIDIA.

The problem that we had with ARB_texture_env_crossbar is that its behavior w.r.t. disabled or inconsistent textures was contradictory to that in NV_texture_env_combine4 (or whatever it’s called), as pointed out in a previous post in this thread. So NVIDIA couldn’t support one without breaking the other.

This is kind of a stupid case – I’m not sure that anyone would care all that much what happens if you use an inconsistent or disabled texture unit.

In OpenGL 1.4, the language was relaxed so that the results of this case were undefined.