Graphics card capabilities for DEVELOPERS

I was curious: do the major graphics card manufacturers provide spec sheets for their hardware that are intended for developers?

Basically, I want info that isn’t some drummed-up marketing / back-of-the-box hype.

You know, good hard stats like:
16 texture units
It supports <X> opengl feature in hardware

For example, I’d want to know: does this hardware perform glBlendFuncSeparate in hardware? Does it have a hardware-accelerated accumulation buffer?

From what I’ve been able to see, neither NVIDIA nor ATI seems to have real specs like this posted anywhere on their sites. Am I just missing them?


Dunno about NVIDIA (their developer page is much bigger than ATI’s, so I’d guess they have something similar), but ATI has a document called “Radeon 9700 OpenGL Programming and Optimization Guide”, which should be what you’re looking for (at least for the R3x0 architecture). It’s 13 pages long and very informative for anyone developing on that kind of hardware. It tells you, for example, which shader instructions need which resources, which data types are supported, and so on.
You can find it in their (rather small and otherwise disappointing) GL SDK.

Edit: Here’s the link to the GL SDK… their developer section is a real mess.

Check the OpenGL Hardware Registry for the extensions each card supports.

But really, if you’re a developer, buy as many different cards as your QA budget can afford. If you’re pushing the hardware, you’ll run into different performance characteristics, and different driver bugs, on each card.

NVIDIA has the same kind of documents, but at the moment I think they’re only available to registered developers.

Take a look at this PDF file (~2 MB):

There’s a table showing which extensions each card supports.