Hello!
Yes, ATI have a readme file regarding their support for pixel buffers. Apparently SGI didn’t release the GLX 1.3 specification, and so ATI can’t “officially” support pixelbuffers. I don’t know how nVidia manage it, though.
Anyway. Although ATI’s GLX doesn’t list pixelbuffers in its extension string, they still expose the pixelbuffer function hooks. It’s magic, surely. I have found, however, that their support has issues. Seriously, it’s quite strange. I’ve had to add a test in my code to see whether I’m running on an ATI card and do some things differently if I am. For example:
- AGLX expects the token None to terminate the pixelbuffer attribute lists; GLX_NONE doesn’t work. Some code I’ve seen (and mine, initially) used GLX_NONE. I can’t remember exactly what happens… I think AGLX can’t find any compatible pixel formats if you use it. Both None and GLX_NONE work on nGLX.
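For illustration, the sort of setup I mean looks roughly like this; the attribute values, the 512x512 size and the variable names are just placeholders (the usual <GL/glx.h> include and an open display are assumed):

static const int fbattribs[]={
    GLX_DRAWABLE_TYPE, GLX_PBUFFER_BIT,
    GLX_RENDER_TYPE,   GLX_RGBA_BIT,
    GLX_RED_SIZE,      8,
    GLX_GREEN_SIZE,    8,
    GLX_BLUE_SIZE,     8,
    None    /* not GLX_NONE, or AGLX finds no configs */
};
static const int pbattribs[]={
    GLX_PBUFFER_WIDTH,  512,
    GLX_PBUFFER_HEIGHT, 512,
    GLX_PRESERVED_CONTENTS, True,
    None    /* same again */
};

int nconfigs=0;
GLXFBConfig *configs=glXChooseFBConfig(display, DefaultScreen(display),
                                       fbattribs, &nconfigs);
assert(configs && nconfigs>0);
GLXFBConfig config=configs[0];
GLXPbuffer pbuffer=glXCreatePbuffer(display, config, pbattribs);
XFree(configs);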
- I can’t remember the story here either, but here’s a code fragment for creating a new context (compliant == true if using a non-Radeon card):
if(compliant) {
    context=glXCreateNewContext(display, config, GLX_RGBA_TYPE, share, True);
    assert(context);
} else {
    /* buggy Radeon drivers =( so fall back to glXCreateContext via an XVisualInfo */
    XVisualInfo *visinfo=glXGetVisualFromFBConfig(display, config);
    assert(visinfo);
    context=glXCreateContext(display, visinfo, share, True);
    assert(context);
    XFree(visinfo);
}
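(In case it isn’t obvious, compliant is just a flag I set up front. A vendor-string check is one way to do it; this is only a sketch, it needs <string.h>, and the “ATI” substring is an assumption, so check what strings your libGL actually reports.)

/* sketch: decide compliant from the GLX client vendor string.
   the "ATI" substring is an assumption; check what your driver reports. */
const char *vendor=glXGetClientString(display, GLX_VENDOR);
bool compliant=!(vendor && strstr(vendor, "ATI"));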
- glXMakeCurrent’s return value is unreliable: it seems to return False even when the context has actually been made current. Code fragment #2:
bool ok=glXMakeCurrent(display, pbuffer, context);
/* only trust the return value on compliant drivers */
assert((compliant && ok) || !compliant);
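If you want some reassurance that the bind really did happen despite the bogus return value, one option is to ignore it and check the current context and drawable directly (just a sketch):

glXMakeCurrent(display, pbuffer, context);
/* don't trust the return value on AGLX; verify the bind directly */
assert(glXGetCurrentContext()==context);
assert(glXGetCurrentDrawable()==pbuffer);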
- querying pixelbuffer dimensions with glXQueryDrawable just doesn’t work on AGLX: it returns nonsense values. I submit code fragment #3:
#ifndef VIEWPORT_SIZE_HACK
/* the straightforward way: ask GLX for the drawable's dimensions */
Display *currdisplay=glXGetCurrentDisplay();
GLXDrawable currdrawable=glXGetCurrentDrawable();
glXQueryDrawable(currdisplay, currdrawable, GLX_WIDTH, &w);
glXQueryDrawable(currdisplay, currdrawable, GLX_HEIGHT, &h);
#else
/* … glXQueryDrawable lies on AGLX, so we query the viewport size instead */
GLint viewport[4];
glGetIntegerv(GL_VIEWPORT, viewport);
w=viewport[2];
h=viewport[3];
#endif
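One thing to watch: glXQueryDrawable writes unsigned ints, so w and h want to be unsigned int in the first branch. Wrapped up as a helper it might look like this (pbuffer_size is just a name I’ve picked here; the viewport path only holds if nothing has changed the viewport since the pbuffer was made current, because GL initialises the viewport to the drawable’s full size):

static void pbuffer_size(unsigned int *w, unsigned int *h)
{
#ifndef VIEWPORT_SIZE_HACK
    Display *currdisplay=glXGetCurrentDisplay();
    GLXDrawable currdrawable=glXGetCurrentDrawable();
    glXQueryDrawable(currdisplay, currdrawable, GLX_WIDTH, w);
    glXQueryDrawable(currdisplay, currdrawable, GLX_HEIGHT, h);
#else
    /* viewport hack: assumes the viewport still covers the whole pbuffer */
    GLint viewport[4];
    glGetIntegerv(GL_VIEWPORT, viewport);
    *w=(unsigned int)viewport[2];
    *h=(unsigned int)viewport[3];
#endif
}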
Now, I say that ATI have it wrong (rather than nVidia) because I wrote my pixelbuffer class against AIX’s GLX specification. The code works fine on my nVidia card; it only has issues (and hence needs the checks to get around them) when I run it on my ATI card at Uni.
So, the short story is: yes, ATI support pixelbuffers even though they don’t advertise them, but be aware that the support has its quirks.
enjoy!
cheers,
John