I’m experiencing a problem when using transform feedback on machines with two SLI-connected graphics cards. I have tried two GTX 580s and two GTX 280Ms with the latest NVIDIA drivers.
At runtime, several transform-feedback recordings are performed in which I fill vertex buffers using a geometry shader. Both the recorded buffer contents and the subsequent primitive query seem to go wrong.
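For context, my recording pass follows the standard transform-feedback sequence. This is a minimal sketch, assuming GL 3.x, a bound program containing the geometry shader, and buffers/counts created elsewhere (`tfBuffer` and `inputVertexCount` are placeholder names, not from the sample); it needs a live GL context to run:

```cpp
GLuint query;
glGenQueries(1, &query);

// Capture geometry-shader output into tfBuffer, discarding rasterization.
glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, tfBuffer);
glEnable(GL_RASTERIZER_DISCARD);
glBeginQuery(GL_TRANSFORM_FEEDBACK_PRIMITIVES_WRITTEN, query);
glBeginTransformFeedback(GL_TRIANGLES);

glDrawArrays(GL_TRIANGLES, 0, inputVertexCount);

glEndTransformFeedback();
glEndQuery(GL_TRANSFORM_FEEDBACK_PRIMITIVES_WRITTEN);
glDisable(GL_RASTERIZER_DISCARD);

// Read back how many primitives were actually written. Under SLI, this
// count and the buffer contents come back wrong for me.
GLuint primitivesWritten = 0;
glGetQueryObjectuiv(query, GL_QUERY_RESULT, &primitivesWritten);
```

The later draw call then uses `primitivesWritten` to size the array access, which is where the bogus query result bites.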
The resulting effects are vertex buffers with partially incorrect data and array accesses that exceed the vertex buffer bounds (because of the wrong query results). In addition, the vertex buffer in question is only visible every second frame, which produces flickering between frames. Everything works fine in non-SLI mode.
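As a stopgap against the out-of-bounds accesses, I clamp the query result to the capacity of the captured buffer before using it. This is a hypothetical guard with illustrative names (`queriedPrimitives`, `maxPrimitives`), not code from the sample; it only hides the symptom, the recorded data is still wrong:

```cpp
#include <algorithm>
#include <cstdio>

// Clamp a primitive-query result to the number of primitives the captured
// vertex buffer can actually hold, logging when the driver over-reports.
unsigned clampPrimitiveCount(unsigned queriedPrimitives, unsigned maxPrimitives)
{
    if (queriedPrimitives > maxPrimitives)
        std::fprintf(stderr, "query returned %u primitives, buffer holds %u\n",
                     queriedPrimitives, maxPrimitives);
    return std::min(queriedPrimitives, maxPrimitives);
}
```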
Do I have to take any precautions when using SLI? Could this be an NVIDIA OpenGL driver issue?
I was able to reproduce the problem with an NVIDIA OpenGL SDK sample.
The sample “Transform Feedback Fractal” suffers from the same broken vertex buffer and frame flickering when run in SLI mode.
The code can be downloaded here:
The only thing I changed was to replace the line with
sprintf( temp, "%dx%d:%d@%d", 1280, 1024, 32, 60 );
glutGameModeString( temp );
in order to run the sample in a true fullscreen and thus activate SLI.
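For completeness, the fullscreen switch in GLUT game mode looks roughly like this (a minimal sketch; the sample’s surrounding code may differ, and it needs a display to run):

```cpp
char temp[64];
sprintf(temp, "%dx%d:%d@%d", 1280, 1024, 32, 60);  // WxH:bpp@Hz
glutGameModeString(temp);
if (glutGameModeGet(GLUT_GAME_MODE_POSSIBLE))
    glutEnterGameMode();  // exclusive fullscreen, needed for SLI to engage
```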
I also recommend enabling the vertical NVIDIA SLI indicator bar for this test; it lets you verify that SLI is actually active.
Now, at runtime, press ‘c’ to switch to continuous subdivision. Pressing ‘s’ then reproduces my scenario of a transform feedback performed at runtime. The result is, again, a broken vertex buffer on one of the cards.
This gives me even more reason to believe that this might be a driver bug.