My application runs much slower on Linux (Red Hat and SuSE) than on Windows, with the same graphics card.
My machine is dual boot.
Are there any environment variables that I can tweak to enhance performance?
Any advice will be very much appreciated.
Are you sure the problem is the OpenGL driver and not the compiler? Are you using optimization flags (‘gcc -O3’, for instance)?
I’m not sure whether it’s the OpenGL driver or the compiler.
But the difference in performance is so huge that the compilers (Visual C for Windows and Absoft for Red Hat Enterprise) cannot explain the gap on their own.
I do not use any optimization flags on the slow Linux build.
I think you should create some simple test programs using different techniques and compare their performance on both operating systems.
Start with a very simple program that only clears the color and depth buffers. Its performance should be nearly the same on both systems, since clearing and swapping buffers takes place on the graphics hardware and is hardly influenced by the driver.
Then increase the program’s complexity step by step: use display lists, VBOs, draw a huge number of polygons, and so on, until you find the point where the performance starts to differ.
By the way, can you say a bit more about your app? Is it pure Xlib/GLX, or does it use LessTif/Motif? Which techniques does it use (lighting, texturing, display lists, vertex arrays, VBOs)?
Finally: what does glxinfo say?
Running ‘glxinfo | grep rendering’ gives:
“direct rendering: No”
How do I get glxinfo to report “Yes”?
Install the appropriate drivers for your graphics card, and that should get rid of all your problems.
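Once the driver is installed, you can verify that it took effect (assuming glxinfo from the mesa-utils/glx-utils package is available):

```shell
# Show both the direct-rendering flag and the renderer string.
# A renderer string mentioning a software rasterizer means the driver is
# not providing hardware acceleration.
glxinfo | grep -E "direct rendering|OpenGL renderer"
```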
I see. I will install the right driver.
But I am still surprised that my HP Linux workstation does not have the driver, because I bought it with Linux preinstalled.