I just saw the tri stripper demo running on the Linux box of one of my friends.
The thing is, it looked as if the depth buffer was disabled (see gl_mesh.cpp and gl_renderer.cpp for the OpenGL code).
But when compiled and executed on a Windows box it displays correctly, and likewise on a Linux box using the Mesa lib.
So I’m wondering: am I doing something wrong, or is it a driver bug?
My friend’s PC is a P4 2.6C, Asus P4P800, GF2MX, Debian “stable”; tried driver versions 44.96 and 43.63.
IIRC it was working correctly with the 41.91 driver on a P2 350 with the same Debian and a GF2MX.
September 18, 2003, 2:35pm
I see the same results… but I’ve never had similar problems with anything else on Linux.
In fact it looks more like face culling is disabled, as if glDisable(GL_CULL_FACE) were in effect…
September 18, 2003, 2:55pm
I think there have been a bunch of problems with the Linux drivers and the GF2MX. Is the app using VBOs? I’ll test it on my GFFX tonight. But it’s probably best if you send an email to
email@example.com or to the NVIDIA guys who frequent this board (Jason Allen, Cass).
[This message has been edited by PK (edited 09-18-2003).]
September 18, 2003, 3:17pm
It’s not a GF2MX issue… I have a Ti4200 and see the same problem. In fact I had the same problem with a program of mine some time ago (OK on Windows and Mesa)…
I’ll try to find it on my HD or in my backup CDs; it may help identify the problem.
Originally posted by PK:
Is the app using VBOs?
Nope, no VBOs. Only regular VAs, but within a display list.
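A minimal sketch of that setup, assuming hypothetical buffer names (the actual code is in gl_mesh.cpp): plain client-side vertex arrays, with the draw call compiled into a display list. Note that when glDrawElements is recorded inside glNewList, the array data is dereferenced and copied into the list at compile time, so the client arrays only need to be valid while the list is being built.

```c
/* Sketch only -- needs a current OpenGL context to run.
 * Function and parameter names are illustrative, not from the demo. */
#include <GL/gl.h>

void build_strip_list(GLuint list, const GLfloat *verts,
                      const GLuint *indices, GLsizei count)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, verts);

    glNewList(list, GL_COMPILE);   /* array data is copied into the list here */
    glDrawElements(GL_TRIANGLE_STRIP, count, GL_UNSIGNED_INT, indices);
    glEndList();

    glDisableClientState(GL_VERTEX_ARRAY);
}
```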
False alarm, guys.
It seems an absent-minded programmer forgot to pass the GLUT_DEPTH flag in my program (I really wonder who he is). Without that flag GLUT never allocates a depth buffer, so the depth test silently does nothing.
Fortunately I hadn’t sent any report to NVIDIA yet.
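For reference, a minimal GLUT init sketch showing the fix (window title is made up; this fragment needs a windowing system, so it’s an init sketch rather than a runnable test): if GLUT_DEPTH is missing from glutInitDisplayMode, the context has no depth buffer, glEnable(GL_DEPTH_TEST) has nothing to test against, and geometry is drawn in submission order — exactly the symptom at the top of this thread.

```c
#include <GL/glut.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* GLUT_DEPTH is the flag that was missing */
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
    glutCreateWindow("tri stripper demo");
    glEnable(GL_DEPTH_TEST);  /* only effective if a depth buffer exists */
    /* ... register callbacks and enter glutMainLoop() ... */
    return 0;
}
```

The Windows and Mesa results were a red herring: some pixel formats happen to hand back a depth buffer even when one wasn’t requested, so the bug only shows up on configurations that honor the request literally.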