Degenerate Triangle Strips

Hello,

I’ve read up on different opinions about triangle strips and degenerate triangles. Due to the structure of my mesh, I can’t implement the indexing with a single tri_strip without using degenerate tris. It works fine with degenerate tris, but I’ve read that they can actually hamper performance on certain video cards.
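
For reference, here’s roughly how I’m stitching the strips (a simplified sketch; the function name is made up):

#include <cstdint>
#include <vector>

// Join strip B onto strip A by repeating the last index of A and the
// first index of B. The repeated indices produce zero-area (degenerate)
// triangles that draw nothing in fill mode.
std::vector<uint16_t> StitchStrips(std::vector<uint16_t> a,
                                   const std::vector<uint16_t>& b)
{
    if (!a.empty() && !b.empty()) {
        a.push_back(a.back());   // degenerate: last vertex of A repeated
        a.push_back(b.front());  // degenerate: first vertex of B
        // If A had an odd index count, one more repeat is needed to keep
        // the winding order of B's triangles consistent.
        if (a.size() % 2 != 0)
            a.push_back(b.front());
    }
    a.insert(a.end(), b.begin(), b.end());
    return a;
}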

My question is: am I better off simply creating separate triangle strips, or should the degenerate method not cause any problems?

Note that the degenerate tris are visible in wireframe mode (you can make out the slivers), yet everything looks fine in fill mode.

Thanks for reading!

NVIDIA says don’t use degenerates. Download NVIDIA’s stripping library and try it out yourself.

“Due to the structure of my mesh, I can’t implement the indexing with a single tri_strip without using degenerate tris”

Of course not. You’ll have to use hundreds of tristrips. Just bind the vertex array once and then go through all the strips one at a time. Check out NVIDIA’s stripping lib; it gives you back a nice collection of strips to draw, so you can forget about using degenerates.
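
Something like this (a sketch; the Strip layout is an assumption, using plain old vertex arrays):

#include <GL/gl.h>
#include <cstddef>
#include <vector>

// One strip's index data; this layout is just for the sketch.
struct Strip {
    const GLushort* indices;
    GLsizei count;
};

// Bind the shared vertex array once, then draw each strip separately.
void DrawStrips(const GLfloat* vertices, const std::vector<Strip>& strips)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, vertices);

    for (size_t i = 0; i < strips.size(); ++i)
        glDrawElements(GL_TRIANGLE_STRIP, strips[i].count,
                       GL_UNSIGNED_SHORT, strips[i].indices);

    glDisableClientState(GL_VERTEX_ARRAY);
}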

Hi titan,

Thanks, I’ve got it sorted out now. I don’t need the NVTriStrip utility, as I’m generating the index buffers myself in a prebuild step (for terrain rendering). I just needed to be sure that degenerates wouldn’t make certain video cards crash.

The overhead reduction compared to tri lists is pretty large (the index count is almost halved).
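
In case it’s useful to anyone, the generation step looks roughly like this (a simplified sketch; it assumes a row-major vertex layout, and the function name is mine):

#include <cstdint>
#include <vector>

// Build one long triangle strip for a w x h vertex grid laid out
// row-major (the vertex at (x, z) has index z * w + x). Each row emits
// 2 * w indices (an even number), so the two stitching indices between
// rows keep the winding order consistent.
std::vector<uint32_t> BuildTerrainStrip(uint32_t w, uint32_t h)
{
    std::vector<uint32_t> idx;
    for (uint32_t z = 0; z + 1 < h; ++z) {
        if (z > 0) {
            idx.push_back(z * w + (w - 1)); // degenerate: repeat last index of previous row strip
            idx.push_back(z * w);           // degenerate: repeat first index of this row strip
        }
        for (uint32_t x = 0; x < w; ++x) {
            idx.push_back(z * w + x);       // vertex on row z
            idx.push_back((z + 1) * w + x); // vertex on row z + 1
        }
    }
    return idx;
}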

Anyhow, there seem to be a lot of mixed opinions about degenerate triangles in a tristrip. Several NVIDIA employees have commented on other threads stating that it’s 100% the way to go, since it’s very fast for the card to detect whether a tri is degenerate (2 clock cycles) and its vertices still go into the cache. The best answer is to use them in most cases; only if the mesh is too disconnected (meaning there would be a high ratio of degenerate tris to valid tris) should you simply opt for a triangle list.
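
In other words, a rule of thumb as simple as this (the 50% threshold is a number I made up; tune it for your own meshes):

#include <cstddef>

// Prefer a single stitched strip unless the stitching cost dominates.
bool UseStripWithDegenerates(std::size_t validTris, std::size_t degenerateTris)
{
    return degenerateTris * 2 < validTris;  // i.e. ratio below 0.5
}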

NVIDIA doesn’t have a clear opinion about this matter… one moment they tell you to avoid degenerate triangles, and the next they tell you to use them.

Even if they’re almost free on GeForces (and probably Radeons too), I still don’t like them.

Note that NvTriStrip still generates degenerate triangles to make its job easier, even if you don’t ask it to stitch all the strips together.

When dealing with a lot of tri strips there is more draw-call overhead than with a single triangle list, so you should use optimized submission methods: display lists, NV_vertex_array_range, EXT_multi_draw_arrays, NV_primitive_restart, etc…
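
For example, EXT_multi_draw_arrays lets you submit all the strips in a single call instead of looping over glDrawElements (a sketch; on the GL implementations of the day the entry point has to be fetched through the extension mechanism):

#include <GL/gl.h>
#include <GL/glext.h>

// Submit many strips in one call via EXT_multi_draw_arrays. 'counts'
// holds the index count of each strip and 'indices' a pointer to each
// strip's index data.
void DrawStripsBatched(GLsizei stripCount, const GLsizei* counts,
                       const GLvoid* const* indices)
{
    glMultiDrawElementsEXT(GL_TRIANGLE_STRIP, counts,
                           GL_UNSIGNED_SHORT, indices, stripCount);
}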

Degenerate triangles can be seen in wireframe mode because the 3D card cannot tell which direction they’re facing, so it can’t back-face cull them; and while their zero area means they rasterize no pixels in fill mode, their edges still get drawn as lines in wireframe. This can be annoying sometimes, even if most of the time it’s fine.

There is no perfect answer; you should try both solutions and see which one best fits your needs.
