I am experiencing some very weird VBO trouble, so this is a long story:
I am working with a pretty heavy model (around 2.5 million vertices, roughly 4-5 million triangles) that consists of many surfaces; in fact, about 12,000 OpenGL commands are executed while drawing the model, most of them caused by the NVIDIA Cg API, by the way. Most surfaces have between 500 and 1000 vertices, but some have a lot more (around 60,000-70,000). For drawing the geometry I use VBOs with glDrawRangeElements on GL_TRIANGLES (no strips yet), everything set to GL_STATIC_DRAW, and I am not doing any updates on the data. I developed my application on a Quadro FX 3000 with 256 MB of RAM. With just a simple OpenGL shading model this card performs at about 25 fps.
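To make the setup concrete, here is a simplified sketch of the per-surface draw path I mean. The helper name and buffer layout are illustrative only, not my actual code, and the GL calls are shown as comments since they need a live context:

```cpp
#include <algorithm>
#include <cstdint>
#include <utility>
#include <vector>

// Compute the [start, end] vertex-index range referenced by one surface's
// index list, as required by glDrawRangeElements. (Hypothetical helper.)
std::pair<uint32_t, uint32_t> indexRange(const std::vector<uint32_t>& indices) {
    auto mm = std::minmax_element(indices.begin(), indices.end());
    return {*mm.first, *mm.second};
}

// Per-surface draw, roughly:
//
//   glBindBuffer(GL_ARRAY_BUFFER, vbo);          // static vertex data
//   glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);  // static index data
//   auto range = indexRange(surfaceIndices);
//   glDrawRangeElements(GL_TRIANGLES, range.first, range.second,
//                       surfaceIndices.size(), GL_UNSIGNED_INT,
//                       reinterpret_cast<void*>(surfaceOffsetBytes));
//
// Both buffers are created once with GL_STATIC_DRAW and never updated.
```

So every one of the ~12,000 commands per frame is a small state change or draw call like the above.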
Now I tested the program on a GeForce 7800 GTX, both with 256 MB and with 512 MB, and both perform at about 5-10 fps! So obviously something must be seriously wrong. I also monitored the memory usage on the graphics cards: while the RAM of the Quadro gets filled up to about 190 MB, the GeForce cards show a very irritating behaviour. While loading my models, shaders and all the textures, memory usage goes up to about 180 MB, but when I start to draw the first frame, the memory usage explodes and fills up the whole graphics card RAM. When the maximum is reached, it flushes the whole memory and stays at 20 MB usage, which looks like everything (textures, shaders, VBOs) is transferred to main memory.
I am really a little bit stuck with this. I already tried different drivers on the GeForce cards, but all show the same behaviour. To me this looks like a very serious bug in the drivers, since I don't have the slightest idea what could cause it. And the GeForce 7800 GTX should be a lot faster than the Quadro FX 3000 (which is basically a GeForceFX card).
Does anybody have any ideas?