Hardware or software?

The main reason software mode is slow is bilinear filtering. It is slooow to do that in software. Gee, wonder why no old DOS games used bilinear filtering? :stuck_out_tongue:
Microsoft’s software renderer sucks; both Mesa3D and SGI’s software renderer are faster. But, once again, software rendering is dying. It might be revived in a few years when we have “extremely fast” (compared to now) processors, but we’re not there yet.
Besides, Quake ran at 17 FPS on my old Pentium 75, and Quake3 runs at over 60 FPS on my P3 700, so I don’t know what you’re talking about WRT slow animations compared to the DOS days.

Software MODE, are you referring to the PFD_GENERIC_ACCELERATED thingie when creating the pixel format?
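For reference, on Windows you can inspect the flags that DescribePixelFormat fills into the PIXELFORMATDESCRIPTOR to see which implementation a pixel format belongs to. A minimal sketch of that classification, assuming the standard PFD flag values from `<wingdi.h>` (the helper function and enum names are mine, not part of any API):

```c
/* Flag values as defined in <wingdi.h>; redefined here so the sketch
   compiles outside Windows too. */
#define PFD_GENERIC_FORMAT      0x00000040UL
#define PFD_GENERIC_ACCELERATED 0x00001000UL

typedef enum {
    ACCEL_ICD,  /* neither generic flag set: vendor ICD, hardware accelerated */
    ACCEL_MCD,  /* generic + accelerated: MCD driver, partial acceleration   */
    ACCEL_NONE  /* generic only: Microsoft's software renderer               */
} accel_level;

/* Hypothetical helper: classify the dwFlags field of a
   PIXELFORMATDESCRIPTOR filled in by DescribePixelFormat(). */
static accel_level classify_pixel_format(unsigned long dwFlags)
{
    if (!(dwFlags & PFD_GENERIC_FORMAT))
        return ACCEL_ICD;
    if (dwFlags & PFD_GENERIC_ACCELERATED)
        return ACCEL_MCD;
    return ACCEL_NONE;
}
```

Note that this only tells you which implementation owns the pixel format up front; it cannot catch a per-feature software fallback happening inside a hardware ICD.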

No, that’s not what I’m talking about.

Then, on my GeForce, I start using 3D textures, which are not supported in hardware.

That’s what I’m talking about: while running in accelerated mode, you choose a feature which is not supported by your card. Software mode kicks in, because opengl32.dll determines that the feature isn’t implemented in hardware and falls back to software rendering.

So, it’s opengl32.dll that determines whether the driver is supposed to do software or hardware rendering? Somehow it has to know about it. How? If opengl32.dll knows about it and is able to tell the driver, why can’t it tell us as well? Sorry for this somewhat off-topic discussion here, but I find it hard to believe it is opengl32.dll that tells the driver what to do internally.

I’m pretty sure that it’s not OpenGL32.dll that determines this, but the ICD. OpenGL32.dll just works as an interface to the ICD when rendering through it. I still think the ICD should know when it’s rendering in software, and be able to report this to us, even though it may not be trivial to define exactly what “software rendering” consists of. However, Matt says that there are other reasons why it’s a bad idea. I wish I knew what…
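For what it’s worth, one common workaround is to look at the strings the implementation reports: Microsoft’s generic software implementation identifies itself as “GDI Generic” via glGetString(GL_RENDERER). A sketch of that heuristic (the helper name is mine; the string match is a convention rather than a guarantee, and it says nothing about per-feature fallbacks inside a hardware ICD):

```c
#include <string.h>

/* Hypothetical helper: returns 1 if the renderer string looks like
   Microsoft's generic software implementation. Pass it the result of
   glGetString(GL_RENDERER) after creating and binding a context. */
static int looks_like_software_renderer(const char *renderer)
{
    return renderer != NULL && strstr(renderer, "GDI Generic") != NULL;
}
```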