Hello everyone. I'm using the NVStrip library and I've come across some strange behaviour. On a GeForce4 4600 at 640x480 (with Quincunx AA), my app runs at 18 fps without strips and 20 fps with them (~460 KTris rendered). At 1280x1024, it runs at 17 fps without strips and 16 fps WITH them (and not the other way around)!!! My geometry is rendered with VAR in video memory (128 MB available). The thing is, the NVStrip library creates a lot of degenerate triangles, which increases the overall number of triangles sent by approx. 10-15%. Could these degenerate triangles explain the weird relative drop in performance when using strips at high resolution?
Thanks in advance for your advice!
[This message has been edited by Olive (edited 09-20-2002).]
Somebody told me that this may be because, at low resolutions, degenerate triangles are not rasterized thanks to their zero area, while at high resolutions, imprecision from the VAR path (??) can give them a small non-zero area, in which case even the degenerate triangles get rasterized.