Argh. I’ve said this before, and I’m sayin’ it again.
In their documentation, even SGI concede that there is no mechanism in OpenGL to query h/w support for a given function. Granted, you can ask whether the driver KNOWS about a function, but that’s only so the driver won’t throw its hands up in horror when you try to do something it has no idea about. It says nothing about whether the extension is being done in h/w or not.
The only answer, again from SGI’s tech staff, is to profile the machine. But! If you think about it!! It makes perfect sense. At the end of the day, your app doesn’t care whether function X is supported in h/w or not; ALL you’re interested in is whether the machine adequately supports function X.
Using your example, suppose you wanted to make a decision to retain or axe mipmapping. So, you run two (toy) benchmark programs, one with mipmapping and one without. Suppose you get 25 fps WITH mipmapping, and 35 fps WITHOUT it. Now, do you REALLY care if mipmapping is done in hardware?? I would argue that the answer is no: the results demonstrate that the machine, with whatever video card and drivers, can mipmap quite adequately (if 25 fps is acceptable to you, that is). Now, suppose you run this program on another machine, and get 5 fps with mipmapping and 25 fps without it. What do you use? What if I told you that this second machine was a ye olde Voodoo1 card running on a Pentium 1, and the first machine was a poxy card with no h/w support but running on a P3? Does the fact that the second machine (supposedly) has h/w support make any difference??
my 2c worth!