OpenGL Hardware Rendering

I read through the thread of messages about hardware rendering on January 8th, if I remember correctly. I’m using a Voodoo3 3500TV card and am dabbling in the fine art of OpenGL graphics. I downloaded the one file that was suggested by a user (OpenGL32.DLL), and when I ran my program, the cool 3dfx logo came up and my program went to full screen (sort of; it looks like it runs in 640x480 mode), and I was able to move around in the little world I built. But I noticed no improvement in actual speed. I’ve built in a little piece of code that returns my frames per second, but no difference can be seen. The worst part is that now I only get approximately 7 fps, and that cannot be hardware rendering. Before I put in the new DLL, whether I was running it as a tiny window or full screen (maximized window), I’d get approximately 15 fps. So obviously I’m missing something here. I’d like to have my OpenGL program hardware driven, so any assistance to that end would be appreciated.
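For reference, a frame counter like the one described can be sketched in plain C. The names and structure here are my own, not the poster's actual code; the caller supplies timestamps in seconds (in a real Windows app you'd get them from timeGetTime() or QueryPerformanceCounter):

```c
/* Hypothetical FPS counter: call fps_tick() once per frame with the
 * current time in seconds. It returns the measured frames-per-second
 * once a full second has elapsed, or a negative value while it is
 * still accumulating frames. */
typedef struct {
    double window_start;  /* start of the current 1-second window */
    int    frames;        /* frames counted so far in the window */
} FpsCounter;

double fps_tick(FpsCounter *c, double now_seconds)
{
    double elapsed;
    c->frames++;
    elapsed = now_seconds - c->window_start;
    if (elapsed >= 1.0) {
        double fps = c->frames / elapsed;
        c->window_start = now_seconds;  /* start a new window */
        c->frames = 0;
        return fps;
    }
    return -1.0;  /* not a full second yet */
}
```

Averaging over a whole second like this smooths out per-frame jitter, which matters when comparing numbers as close as 7 vs. 15 fps.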


I think you only have hardware support in 16-bit color depth, so check that. Do you really have to rename DLLs? I believe that problem is with the Voodoo1 & 2.

The Voodoo3 3500 only supports 16-bit color 3D acceleration, and I believe it is not compatible with the old custom opengl32.dlls released for the Voodoo1 and 2, so be careful what you download.
In fact, the Voodoo3 3500 should have its own full OpenGL ICD, so you shouldn’t need to download anything more than the latest 3dfx drivers.
Not 100% sure, maybe a Voodoo3 user could tell you better.

I think that one or more of these things might be happening to you:
-you are trying to install a MiniGL driver, which is not needed by the Voodoo3 3500.
-you may have your desktop in a non-16-bit color depth. The Voodoo3 only renders to 16-bit color.
-you don’t have the latest drivers from the 3dfx site.

An easy way to find out who is rendering: check glGetString(GL_VENDOR) and glGetString(GL_RENDERER). If they aren’t something like “3dfx” and “voodoo”, you are not using your card for 3D (“Microsoft”/“GDI Generic” or similar means the software renderer).
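As a sketch (assuming a current rendering context; the substring tests here are my own heuristic based on the strings mentioned above, not an official API), the check could look like:

```c
#include <string.h>

/* Heuristic check of the strings returned by glGetString(GL_VENDOR)
 * and glGetString(GL_RENDERER): the Windows software renderer reports
 * "Microsoft" as the vendor and "GDI Generic" as the renderer.
 * Returns nonzero if it looks like the software path is in use. */
int looks_like_software_renderer(const char *vendor, const char *renderer)
{
    if (vendor == NULL || renderer == NULL)
        return 1;  /* no context or no strings: assume the worst */
    return strstr(vendor, "Microsoft") != NULL ||
           strstr(renderer, "GDI Generic") != NULL;
}
```

In the program you'd call this right after activating the context, passing `(const char *)glGetString(GL_VENDOR)` and `(const char *)glGetString(GL_RENDERER)`.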

[This message has been edited by coco (edited 01-10-2001).]

I checked my desktop settings, and it is set at 16 bit colour… 1024x768 resolution. I do also have the most up-to-date drivers for my card.
Now I checked my windows/system directory and found the following files that seem to be related to my 3DFX card:

The one file in there, “3dfxOGL.dll”, looks to be the closest match to anything relating to 3dfx and OpenGL. Any ideas?

Also, maybe I’m missing something here. Do you have to be running in a specific resolution at a specific colour depth before hardware rendering kicks in? I was playing around with Glide doing some testing. I ran a test that used DirectX, and I got approximately 55 frames per second. When I ran the test that used Glide, it said that I was running at approximately 183 frames per second. The difference between the two was that the DirectX test ran in 800x600 mode, while the Glide test ran in 1024x768 mode. It may be nothing, but I don’t know.

What do you think? Is there no hope for me? hahaha.


I’ve had a similar problem with voodoo2, I don’t know if what I’m going to tell you will be helpful…

Some 3dfx cards have a driver (3dfxvgl.dll for the Voodoo2) that is a full implementation of the OpenGL library.
Unlike with other cards, calls to opengl32.dll (which are made because the program was statically linked with opengl32.lib) aren’t automatically redirected to the appropriate driver. You then have two solutions:

  • rename 3dfxvgl.dll to opengl32.dll and put it in the app path
  • dynamically load opengl32.dll or 3dfxvgl.dll with LoadLibrary after detecting a 3dfx card

The first approach is simple but won’t work if you give your app to someone with a 3dfx card and no file renamed.
The second is a bit harder but always works.
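The second approach can be sketched roughly like this (Windows-only; 3dfxvgl.dll is the Voodoo2 driver name from the post, and the card-detection step is left out since it wasn't described):

```c
#include <windows.h>
#include <GL/gl.h>

/* Hedged sketch: load the 3dfx GL library (or fall back to the
 * system opengl32.dll) at run time. Real code would then resolve
 * every GL entry point it uses with GetProcAddress instead of
 * linking against opengl32.lib. */
typedef const GLubyte * (APIENTRY *PFNGLGETSTRING)(GLenum name);

HMODULE load_gl_library(int have_3dfx_card)
{
    /* have_3dfx_card would come from your own detection code */
    HMODULE h = NULL;
    if (have_3dfx_card)
        h = LoadLibrary("3dfxvgl.dll");   /* Voodoo2 full GL driver */
    if (h == NULL)
        h = LoadLibrary("opengl32.dll");  /* generic implementation */
    return h;
}

/* usage:
 *   HMODULE gl = load_gl_library(found_3dfx);
 *   PFNGLGETSTRING pglGetString =
 *       (PFNGLGETSTRING)GetProcAddress(gl, "glGetString");
 */
```

The price of this flexibility is that every GL call in the app has to go through a function pointer fetched this way.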

The opengl32.dll you downloaded may be a Mesa implementation.
In any case, try glGetString(GL_VENDOR) after you have activated the rendering context: it’s the best way to know whether you are going through the generic implementation or the 3dfx driver.
I hope this may help you…

[This message has been edited by Teofuzz (edited 01-11-2001).]

I put in the code to tell me who was handling the rendering (GL_VENDOR and GL_RENDERER as suggested by COCO) and got the following information returned to me:

3DFX Interactive Inc.
3DFX/Voodoo3 ™/2 TMUs/16MB SDRAM/3DNow!/ICD (Jun 5 2000)

So it appears as if the video card is doing the rendering. But then another question: why would hardware rendering give such a bad frame rate, around 7 frames per second? I then copied the one DLL file, 3dfxOGL.dll, into my app’s directory and renamed it to opengl32.dll, and when I ran my program I got an illegal operation, so I guess it wasn’t what I thought it was. Hmm, interesting.

Could someone also explain to me what TMUs and ICDs are?


I would advise you not to mess with those DLLs that way. Make sure the only opengl32.dll you have is the one that came with Windows, located in the system directory. 3dfxOGL.dll is surely not an equivalent of OpenGL32.DLL, even if it is the implementation of OpenGL for your card, so it’s no surprise you get crashes.
From what you posted, your drivers seem to be working OK.
You may be getting such low performance in hardware if you do some nasty stuff (like using glDrawPixels, glCopyPixels, or even glTexImage2D every frame with lots of textures, etc.). Check that your code is optimized enough.
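To illustrate the glTexImage2D point (a sketch with assumed names; the key idea is that the upload belongs in setup, and only the cheap bind belongs in the frame loop):

```c
#include <GL/gl.h>

static GLuint tex;  /* texture object created once at startup */

/* One-time setup: upload the pixel data to the card once. */
void init_textures(const unsigned char *pixels, int w, int h)
{
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);  /* upload once */
}

/* Per-frame drawing: just bind the already-uploaded texture. */
void draw_frame(void)
{
    glBindTexture(GL_TEXTURE_2D, tex);  /* cheap: no data transfer */
    /* ... draw geometry ... */
    /* calling glTexImage2D here instead would re-send the whole
     * texture over the bus every frame and crush the framerate */
}
```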

Could someone also explain to me what TMUs and ICDs are?

Texture Mapping Units & Installable Client Driver.
The first refers to the chips on your card, the second to the type of driver… most drivers are ICDs, I think…

I supposed renaming wouldn’t work with the Voodoo3…
Try to compile and test a program that renders only a quad, without depth buffer, smoothing, etc.; if you then get good supersonic framerates, look at your code.
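A minimal benchmark in that spirit might look like this (a sketch; I've used GLUT here for brevity, the original poster's windowing code may well differ, and the FPS measurement is left out):

```c
#include <GL/glut.h>

/* One flat quad, double-buffered RGB, no depth buffer, no smoothing:
 * if even this is slow, the problem is the setup, not the scene. */
static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_QUADS);
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.5f,  0.5f);
        glVertex2f(-0.5f,  0.5f);
    glEnd();
    glutSwapBuffers();
    glutPostRedisplay();  /* redraw continuously for an FPS test */
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);  /* note: no GLUT_DEPTH */
    glutCreateWindow("quad benchmark");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```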


PS: with the Voodoo2, enabling the depth buffer kills the framerate from 60 down to 0.01, which is not a good thing… soon I’ll post a topic about this…