stencil buffer works on one machine but not on another


I am using the stencil buffer to mask out a certain region. It works perfectly fine on my lab computers; however, when I run it on my laptop, it seems nothing happens. My laptop is using an NVIDIA GeForce Go 6200. Could this be related to hardware?
I think the stencil buffer is also supported by OpenGL in software, even if it is not supported by the hardware, right?
Any information will be appreciated.


Debug your pixelformat selection code on the laptop and check if it gets stencil bits.
If not, check the color resolution of the display. Some boards do not support stencil in highcolor at all, and most do not support stencil with 16-bit depth buffers!
Request a 24-bit depth buffer and 8 stencil bits; that should work on most boards.
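As a sketch of that request on Windows (this is illustrative configuration code, assuming a valid device context `hdc` obtained from GetDC; the field values are the suggestion above, not the only workable combination):

```cpp
// Sketch: request a 24-bit depth buffer and 8 stencil bits.
PIXELFORMATDESCRIPTOR pfd = {0};
pfd.nSize        = sizeof(PIXELFORMATDESCRIPTOR);
pfd.nVersion     = 1;
pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType   = PFD_TYPE_RGBA;
pfd.cColorBits   = 32;
pfd.cDepthBits   = 24;  // 16-bit depth often rules out stencil
pfd.cStencilBits = 8;   // the stencil bits we actually need

int index = ChoosePixelFormat(hdc, &pfd);
// Verify what the driver really picked before committing to it:
DescribePixelFormat(hdc, index, sizeof(pfd), &pfd);
// pfd.cStencilBits should be >= 1 before SetPixelFormat(hdc, index, &pfd)
```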

It’s also a good thing to have bpp equal in pixel format and in display mode. So, first change your display mode, and then search for pixel format. In different desktop color depths you can get different set of pixel formats.
Also, rendering in 32 bpp on a 16 bpp desktop can cause severe performance drops (at least this was the case with my RIVA TNT2 under Windows 98 a few years ago).
Choosing the proper pixel format is actually a multi-criteria decision problem, so a professional application should never rely on ChoosePixelFormat. :]
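To illustrate that multi-criteria idea, here is a minimal sketch (the scoring function and its weights are made up for illustration; only the hard depth/stencil requirements come from this thread) of rating each candidate format and picking the best one yourself:

```cpp
// Hypothetical score for one pixel format, given its bit counts and the
// desktop color depth. Depth >= 24 and stencil >= 8 are hard requirements;
// matching the desktop bpp is only a preference.
int scoreFormat(int colorBits, int depthBits, int stencilBits, int desktopBpp)
{
    if (stencilBits < 8 || depthBits < 24)
        return -1;                     // reject: missing required bits
    int score = 0;
    if (colorBits == desktopBpp)
        score += 100;                  // prefer matching the desktop bpp
    score += depthBits + stencilBits;  // mild preference for more bits
    return score;
}
```

You would run this over every format reported by DescribePixelFormat() and keep the highest-scoring index, instead of trusting whatever ChoosePixelFormat() returns.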

Thank you for your help. Please see my detailed explanation as follows:

What I basically want to do is to chop a small triangle
area off a big triangle. I use stencil buffer to do this.
The following is the code with the comments.


//enable stenciling and clear the stencil buffer
glEnable(GL_STENCIL_TEST);
glClear(GL_STENCIL_BUFFER_BIT);

//first pass: always pass the stencil test and write 1
//wherever the small triangle is rasterized
glStencilFunc(GL_ALWAYS, 0x1, 0x1);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);

//turn off color and depth writes so the small triangle used
//to set the stencil buffer will not show up
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glDepthMask(GL_FALSE);

//draw the red smaller triangle to set the stencil buffer
glColor3f(1.0, 0, 0);
glBegin(GL_TRIANGLES);
//the small triangle is at depth -22
glVertex3f(-1.0, 0, -22);
glVertex3f(1.0, 0, -22);
glVertex3f(0, 1, -22);
glEnd();

//turn color and depth writes back on
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glDepthMask(GL_TRUE);

//second pass: draw the blue big triangle; the part that overlaps
//the smaller triangle is chopped off
glStencilFunc(GL_NOTEQUAL, 0x1, 0x1);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
glColor3f(0.0, 0, 1.0);
glBegin(GL_TRIANGLES);
//the big triangle is at depth -21
glVertex3f(-2.0, 0, -21);
glVertex3f(2.0, 0, -21);
glVertex3f(0, 2, -21);
glEnd();


The code works perfectly fine on ATI graphics cards; however, on
nVidia cards it does not work: nothing is chopped off the big
triangle. However, if I change the z-value of the big triangle from
-21 to -23, which puts the big triangle behind the smaller one, it
works on nVidia cards. In both cases, ATI cards work correctly.
This makes me believe that for nVidia, the stencil buffer is
somehow associated with the depth buffer. Nonetheless, in standard
OpenGL, the stencil buffer is simply a 2D mask.

Any help is highly appreciated.

Before you try using stencil buffer try this:

GLint stencilBits = 0;
glGetIntegerv(GL_STENCIL_BITS, &stencilBits);
if (stencilBits < 1)
  MessageBox(NULL, "Life sucks without a stencil buffer.", "Stencil test", MB_OK);


Hi k_szczech,

Thanks for your help. It turns out that the “stencilBits” value returned is 0, which means there is no stencil buffer on the graphics card in my laptop, or at least no stencil buffer is detected. However, this is also very strange. My laptop has a GeForce Go 6200, and it is hard to believe it doesn’t have a stencil buffer. Moreover, I tried this on our lab computers with nVidia Quadro GPUs, and no stencil buffer is detected there either. I also updated to the latest drivers, and everything is the same. Do you have any suggestions?

Thank you very much.

Read all answers again!
If you don’t get stencil bits check your pixelformat selection routine.
That’s the code which contains the struct PIXELFORMATDESCRIPTOR and the functions ChoosePixelFormat() or DescribePixelFormat() and definitely SetPixelFormat().

After you have chosen a pixelformat index call a DescribePixelFormat() on it and look at the number of stencil bits in the PIXELFORMATDESCRIPTOR structure with the debugger or print it. If you have one without stencil bits, don’t pick it.
It’s simple to look at all exported pixelformats with DescribePixelFormat. Read the manual about it.
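A rough sketch of that enumeration (illustrative only; it assumes a valid device context `hdc` for the window in question):

```cpp
// Sketch: walk every pixel format the driver exports and print its
// color/depth/stencil bit counts, so formats without stencil stand out.
PIXELFORMATDESCRIPTOR pfd;
// DescribePixelFormat returns the highest pixel format index:
int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);
for (int i = 1; i <= count; ++i) {
    DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
    printf("format %d: color=%d depth=%d stencil=%d\n",
           i, pfd.cColorBits, pfd.cDepthBits, pfd.cStencilBits);
}
```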

Hi Relic,

Thank you very much for your detailed explanation. The reason I want to stay away from what you described is that I am reluctant to make my program platform-dependent.

Actually, I think I have almost figured out the reason with your help. The windowing system I am using is Qt; again, my reason for using it is that it is platform independent. In Qt, the OpenGL setup is handled through QGLFormat, which has a member function called setStencil() that turns the stencil buffer on or off. In my case, on the ATI machines, when I use setStencil(true), the stencil buffer is turned on; however, on nVidia machines setStencil(true) has no effect: the number of stencil bits is still 0. I am wondering if this could be because Qt has some incompatibility with the nVidia driver on this issue.

When your application runs on a particular platform, it is necessarily platform dependent, insofar as the windowing system in particular is concerned. Whether you deal with the details of the OS window-creation process yourself, or some library does it for you, is up to you, but the details are the same regardless, and they are covered quite fully in this thread and the linked documentation.

If your problem centers on the use of Qt, then I suggest that you refer to its documentation, or submit a detailed bug report to the Qt authors, including a small, concise reproduction application along with your hardware/system information and driver version.

Hi Guys,

Thank you all for the replies. I finally figured it all out: I wasn’t calling setStencil() in the right place or in the right way. Once I used it correctly and put it in the constructor, it works. Man, I can finally proceed to work on something else now. Whew…
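For anyone hitting the same thing, the working pattern looks roughly like this (a sketch against the Qt 4-era QGLWidget/QGLFormat API; the widget class name is made up):

```cpp
#include <QGLWidget>

// Sketch: build the desired format first and hand it to the QGLWidget
// constructor, so the stencil request exists before the GL context is
// created. Calling setStencil(true) after construction can come too late.
class MaskedWidget : public QGLWidget {  // hypothetical class name
public:
    MaskedWidget(QWidget* parent = 0)
        : QGLWidget(makeFormat(), parent) {}

private:
    static QGLFormat makeFormat() {
        QGLFormat fmt = QGLFormat::defaultFormat();
        fmt.setStencil(true);  // request stencil bits up front
        fmt.setDepth(true);    // keep a depth buffer as well
        return fmt;
    }
};
```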