Who is responsible for the number of buffers available in the framebuffer? Is it the video card, the OS, or a combination of the two? I need to have a 'system' that has as many of the OpenGL buffers as possible (see Red Book, page 195, 'glDrawBuffer'). With our current Windows 2000 workstation, we have access to:
If you read the fine print, this actually means we have only two buffers: GL_FRONT_LEFT and GL_BACK_LEFT. I believe this would be classified as a “Monoscopic, double-buffered” context if DOUBLE_BUFFER is specified at initialization.
Note we have - N O - auxiliary buffers. We would really like to have 4 auxiliary buffers. So the question is:
What OS and video card should we obtain to have a “stereoscopic context” and at least 4 auxiliary buffers? In short, we need some kick-butt video capabilities.
Thanks in advance.
The number of buffers is limited by the vendor-specific OpenGL implementation, which is part of the graphics card driver.
On consumer-level hardware (GeForce/Radeon) you usually only get a front and back buffer along with the Z and stencil buffers (which may be interleaved with the color buffers). The accumulation buffer is still emulated in software on most consumer cards.
For true double-buffered stereoscopic rendering you need a professional OpenGL accelerator like the NVIDIA Quadro, ATI FireGL, or 3Dlabs Wildcat series cards.
But even the pro cards don't offer aux buffers, because you don't need them anymore. You can use the render-to-texture extensions to generate an arbitrary number of views very easily, and because they are textures you can blend/wrap/whatever them in a very convenient way.
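As a rough sketch of that approach: the snippet below renders an extra "view" into a texture instead of an aux buffer. It assumes a current GL context and a driver exposing GL_EXT_framebuffer_object (a later extension; in the Windows 2000 era the same effect was achieved with pbuffers and WGL_ARB_render_texture) - check the extension string before relying on it:

```c
/* Hedged sketch, not a complete program: requires a current GL context
 * and the GL_EXT_framebuffer_object extension. */
GLuint tex, fbo;

/* Create the texture that will receive the extra view. */
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 512, 512, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

/* Attach it to a framebuffer object and redirect rendering to it. */
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, tex, 0);

/* ... draw the extra view here ... */

glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0); /* back to the window */
/* tex can now be blended/wrapped like any other texture */
```

Each additional view is just another texture/FBO pair, which is why this scales to an arbitrary number of views where aux buffers were fixed at whatever the pixel format offered.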
Thanks for the reply.
When I query my PC, it says:
Quadro4 750 XGL/AGP/SSE2
GL_ARB_depth_texture…ad nauseam - lots of stuff.
Has a really lengthy list of attributes.
When I query our Windows 2000 work station, it says:
GL_WIN_swap_hint GL_EXT_bgra GL_EXT_paletted_texture
The NVIDIA Quadro4 runs lickety-split, while the GDI Generic is pokey slow. And yet the NVIDIA Quadro is not stereoscopic. Do you know the model number of the NVIDIA card that is stereoscopic?
You seem to be having a programming issue.
This forum is for End Users (people who want to run games and apps that use GL).
Sorry I was out of town.
Well, every Quadro I know of supports stereoscopic OpenGL, and that includes yours. I am currently uncertain whether you need to install an additional driver to enable the stereoscopic output or not. I am gonna double-check on that.
As far as your W2k workstations are concerned, they are stuck with the MS software OpenGL implementation, and the most common reason for this is really simple: you need to install the proper drivers from the manufacturer. The drivers that ship with Windows don't support hardware-accelerated OpenGL, and that causes the software emulation to kick in.
It's been a while since I used stereo, so it did not occur to me at once, but you need to select a stereo pixel format. If you are using GLUT you need to call something like:
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);
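Once a stereo pixel format like that is granted, the left/right draw buffers from the Red Book's glDrawBuffer discussion become selectable. A rough sketch of a stereo display callback (assuming the GLUT_STEREO request above actually succeeded - you can verify with glGetBooleanv(GL_STEREO, …)) might look like:

```c
/* Hedged sketch of a GLUT display callback for quad-buffered stereo;
 * assumes the window was created with GLUT_STEREO. */
glDrawBuffer(GL_BACK_LEFT);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
/* ... set the left-eye projection/modelview and draw the scene ... */

glDrawBuffer(GL_BACK_RIGHT);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
/* ... set the right-eye projection/modelview and draw the scene ... */

glutSwapBuffers(); /* swaps both the left and right back buffers */
```

If the driver did not grant a stereo pixel format, selecting GL_BACK_RIGHT will raise a GL error, so the glGetBooleanv check is worth doing before committing to this path.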
May I ask what you are trying to do? I ask because it sounds like you're trying to use the wrong methods to solve your problem.