VAR problem on NVidia for Mac

We’ve got a problem with our vertex array range (VAR) code on NVidia hardware, using the Apple VAR extension.

This problem does NOT occur with the same code using the NVidia VAR extensions on Windows (the only difference is the memory allocation).
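To be concrete about that allocation difference, the two paths look roughly like this (a sketch, not our actual code; sizes, priority values, and function names wrapping the GL calls are hypothetical, and a current GL context is assumed):

```c
#include <stdlib.h>

/* Windows, NV_vertex_array_range: the range must live in memory handed
   out by the driver (wglAllocateMemoryNV; a priority near 0.5 tends to
   give AGP-style memory). */
void *alloc_var_nv(GLsizei size)
{
    void *p = wglAllocateMemoryNV(size, 0.0f, 0.0f, 0.5f);
    glVertexArrayRangeNV(size, p);
    glEnableClientState(GL_VERTEX_ARRAY_RANGE_NV);
    return p;
}

/* Mac, APPLE_vertex_array_range: ordinary application memory is legal. */
void *alloc_var_apple(GLsizei size)
{
    void *p = malloc((size_t)size);
    glVertexArrayRangeAPPLE(size, p);
    glEnableClientState(GL_VERTEX_ARRAY_RANGE_APPLE);
    return p;
}
```

Everything downstream of the allocation (filling the range, drawing from it) is the same on both platforms.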

This problem does NOT occur when using CVAs (compiled vertex arrays). Only when using VARs.

The problem manifests itself as a “flashing” texture. You can see an example at:

Needless to say, the textures are not supposed to be flashing white like that.

We do use fences for synchronization.

Has anyone seen anything like this before?



For the archives, the problem was caused by not doing a glFlushVertexArrayRange after data had been modified.

On Windows, when using fences with the NV extensions, a flush is not required.

However, on the Mac, the flush is still required, even when fences are used.
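For reference, the update loop that fixed it looks roughly like this (a sketch assuming the APPLE_vertex_array_range and APPLE_fence entry points; the buffer, fence, and size names are hypothetical, and a current GL context is assumed):

```c
#include <string.h>

/* Sketch: rewrite part of a VAR buffer and draw from it on the Mac. */
void update_and_draw(GLubyte *varBuffer, const GLubyte *newData,
                     GLsizei dirtyBytes, GLuint fence,
                     GLsizei indexCount, const GLushort *indices)
{
    /* Wait until the GPU has finished reading the range (fence was set
       after last frame's draw). */
    glFinishFenceAPPLE(fence);

    /* Modify the vertex data in place. */
    memcpy(varBuffer, newData, (size_t)dirtyBytes);

    /* On the Mac this flush is required after every modification,
       even though fences are in use. On Windows with the NV extension
       it can be skipped when fences are used. */
    glFlushVertexArrayRangeAPPLE(dirtyBytes, varBuffer);

    glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_SHORT, indices);

    /* Mark the point after which the range is safe to rewrite. */
    glSetFenceAPPLE(fence);
}
```

Note that the Apple flush call takes the modified range (length and pointer), unlike glFlushVertexArrayRangeNV, which takes no arguments.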

Additionally, we found that on ATI hardware, all VAR data must be floating point. We initially used unsigned bytes for our color data, but when you do so, the driver currently drops you back to immediate mode.

ATI is removing this restriction soon, but it will only affect (I believe) the 8500 and higher. The original Radeon and below do not have the hardware capability to handle mixed data types in a VAR on the Macintosh, due to byte-swapping which must occur.
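Working around that restriction just means expanding the 8-bit colors to floats when filling the VAR. A minimal sketch (the function name is ours, not from any GL header):

```c
#include <stddef.h>

/* Convert packed 8-bit RGBA colors to floats in [0, 1] so that every
   attribute in the VAR is floating point. 'count' is the number of
   RGBA vertices, so 4 * count components are converted. */
void colors_ubyte_to_float(const unsigned char *src, float *dst, size_t count)
{
    for (size_t i = 0; i < count * 4; ++i)
        dst[i] = (float)src[i] / 255.0f;
}
```

The cost is 4x the memory for color data, but it keeps the whole array on the hardware path.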


[This message has been edited by wadesworld (edited 02-14-2003).]