Auto derivative problem for Nvidia GPU (ATI is OK)

I created a demo for distance field contour texturing (Green's method from SIGGRAPH 2007), but ran into a strange bug. ATI hardware behaves as expected, but on Nvidia hardware the dFdx() and dFdy() functions, and the related fwidth(), behave erratically.
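For context, a distance-field contour shader of this kind typically uses the screen-space derivatives to scale the antialiasing step width, which is why broken dFdx()/dFdy() shows up directly as edge artefacts. A minimal sketch of the idea (this is not the actual fragment1.glsl; the sampler name and channel are illustrative):

```glsl
uniform sampler2D disttex; // distance field texture (hypothetical name)
varying vec2 st;           // interpolated texture coordinates

void main() {
    // Signed distance, remapped from the [0,1] texel range
    float D = texture2D(disttex, st).a - 0.5;
    // Screen-space rate of change of the distance field;
    // this is where misbehaving dFdx()/dFdy() causes artefacts
    float width = fwidth(D);
    // Smooth, resolution-independent edge transition
    float alpha = smoothstep(-width, width, D);
    gl_FragColor = vec4(vec3(alpha), 1.0);
}
```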

Please have a look at the demo linked from:

http://contourtextures.wikidot.com

(Windows, Mac and Linux versions are available.)
The strange and nasty artefacts occur when the textured plane is rotated out of the screen plane, using ctrl + vertical mouse drag in the demo.

Any ideas what causes this? I would love to present a demo that works on both platforms, and I see no reason why it shouldn’t work. What am I doing wrong? The problematic shader is in the file “fragment1.glsl” in the demo archive.

I think I wrote about this when this post was on the opengl.org front page, but the comments don't seem to work (or behave differently from what I expected).

Sorry, not much help, but have you tried disabling anisotropy?

I would test it myself, but the 'application controlled' checkbox in the Nvidia driver panel refuses to be unchecked. Driver bug?

I experienced differences in dFdy() on ATI vs. Nvidia. The latter did it right according to the spec, while ATI did the reverse. I'm not using these derivatives any more…

Toyed a bit with the GLSL code but couldn't find a way to fix it; the sign of dFdy naturally doesn't matter here.
I suggest you make a fixed-case scenario (a slight fixed tilt, with image 4 loaded) and debug via RGBA32F render targets.
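One way to do that debugging: dump the raw derivatives straight into the float render target instead of shading, so they can be read back and inspected. A sketch (variable names are illustrative, assuming D is the sampled distance value):

```glsl
// Debug variant: write the derivatives of the distance value
// into an RGBA32F target for inspection instead of shading
float D = texture2D(disttex, st).a - 0.5;
gl_FragColor = vec4(dFdx(D), dFdy(D), fwidth(D), 1.0);
```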

The dFdy sign is flipped on ATI when you render into an FBO. When rendering into a window, dFdy works as expected.
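If only the magnitude of the derivative is needed, that flip can be sidestepped by taking abs(), which is effectively what fwidth() does anyway per the GLSL spec:

```glsl
// fwidth(p) is defined as abs(dFdx(p)) + abs(dFdy(p)),
// so it is immune to a flipped dFdy sign
float w = abs(dFdx(D)) + abs(dFdy(D)); // same as fwidth(D)
```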

I filed this as a bug some time ago. I hope to get it fixed in a future driver version not all too far from now :wink:

Thanks for the suggestions. A pity that the auto derivatives are wrong on ATI, but as I said, this problem is on Nvidia hardware, and as pointed out above, the sign doesn't matter here. I guess I'll have to look closer into what actually happens. My main problem for debugging is that I don't have easy access to Nvidia hardware.

Anisotropic analytic antialiasing is all about auto derivatives, so I see no alternative to using them. Oh, well.
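For the record, the reason the derivatives are unavoidable here: the antialiasing width has to track how fast the distance field changes across the screen, which under a perspective tilt means projecting its gradient to pixel space. A rough sketch of that step (illustrative, not the demo's exact code):

```glsl
// Direction-aware filter width: the length of the screen-space
// gradient of D, rather than the isotropic fwidth() sum.
// Under a strong tilt this tracks the true edge footprint better.
vec2 grad = vec2(dFdx(D), dFdy(D));
float width = length(grad); // distance-field units per pixel
float alpha = smoothstep(-width, width, D);
```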