Strange Rendering Artifacts

Thanks in advance for any help.

After building my OpenGL/C++ app in Visual Studio 2003, I get very strange rendering artifacts.
Many triangles fail to render whenever something else is rendered behind them. The triangles that fail vary from frame to frame, so there is massive flickering all over.

The identical code runs fine on Windows when compiled with gcc-mingw. It also works fine when compiled and run on Linux.

I thought this must be some kind of z-fighting or depth-buffer precision issue, but the depth buffer is being set to 24-bit. I have tried disabling back-face culling and enabling GL_DEPTH_TEST everywhere, just to see if there is an effect. I have also tried bringing in the far clipping plane and pushing out the near plane. I also tried disabling fog.

I can’t figure out why it would matter which compiler is used. This is driving me nuts.

Thanks in advance for any suggestions… :eek:

Did you try
glDisable( GL_STENCIL_TEST )?
Have you seen it on more than one graphics card?

Let’s assume the code is correct for the gcc compiler.
I’ve seen similar problems before. It normally happens when vertex array data gets corrupted: wrong indices, etc.
Make sure your code is not using any hardcoded sizes, strides, or pointers. Carefully examine every memory movement of the geometry data.

Also check your struct sizes. For example,

MSVC[sizeof(TheStruct)] != GCC[sizeof(TheStruct)]

depending on compiler settings (because of memory alignment). So: check your map loaders (if you read/write structs in code), check the stride in your gl*Pointer calls, check struct offsets, and check the char type (some compilers treat it as signed, others as unsigned).

Also use the offsetof macro (it’s provided by &lt;cstddef&gt;, or can be defined as):

#define offsetof(s,m)   (size_t)&(((s *)0)->m)

offsetof(MyStruct, structmember)

instead of hardcoding offsets.

It seems like many triangles fail to render if there is something else rendered behind them. The triangles that fail vary from frame to frame so there is massive flickering all over.

You could always keep it like that and call it “art.” :smiley:


Thanks for all of the replies.

I am still having problems. The problem appears on both of the cards I have tried (Ti4600 and Go 4200). I have tried disabling VBOs; same problem.

I don’t think it is a problem in the vertex buffers, as I load these only once, yet the flickering appears in random triangles. In other words, any given triangle is rendered correctly in one frame but perhaps not in the next. If the error were in the load, wouldn’t the same triangles be consistently wrong?

I think I will try commenting out large chunks of the application to see if I can spot the culprit.

Thanks again for your replies…
Any additional suggestions are always appreciated.


Well I figured it out!!!

And it was definitely a z-fighting issue. I was loading the near plane from a “Setting” class that loads a Lua script. Here is the pseudo code:

const char* strtemp;
strtemp = getSettings()->getSetting("Near").c_str();
gluPerspective(etc. etc. etc. using nearplane);

Well, it seems that with gcc’s STL the pointer returned by c_str() on that temporary string happened to stay valid, but with Microsoft’s STL it does not - the temporary is destroyed and strtemp is left dangling. Anyway, clearly my code had issues; I don’t know why I was caching the result in a temporary pointer…grrrr.

Hence, the near plane was 0 and I had massive z-fighting.

Oh well, there goes quite a few hours down the tube.

But at least I finally got it - and it was definitely my code that caused it…grrrrr…

Happy coding!

Nice to see you figured it out.

A bit strange to see you declare a constant, then immediately change it. I would have expected at least a compile-time warning - if not a compile-time error over that.

Maybe you meant it to be a static variable, rather than a constant?

T101: It’s the chars that strtemp points at which are const, not strtemp itself (it’s ‘const char *’, not ‘char *const’). The code is correct.