Getting GL_STENCIL_BITS returns 0

I wanted to know the value of GL_STENCIL_BITS on my computer, but when I queried it with glGetIntegerv(), I got 0. I don't think it should be 0. Has anyone had this result before and found a way to get a reasonable number?

I made a bare-minimum code file to show what I tried with glGetIntegerv(); the stencil query is just before the while loop. I initialize stencil_bits to 0 because when I didn't, I'd get a value of 1 in a 64-bit build of GLEW and GLFW, or something like 1400537549 in a 32-bit build. 1 seems too small, and 1400537549 looks like an uninitialized variable, as if glGetIntegerv() never wrote to it.

#include <iostream>
#define GLEW_STATIC
#include <GL/glew.h>
#include <GLFW/glfw3.h>

int main()
{
	// Initialize GLFW
	glfwInit();
	glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
	glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
	glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
	glfwWindowHint(GLFW_RESIZABLE, GL_FALSE);
	GLFWwindow* window = glfwCreateWindow(800, 600, "LearnOpenGL", nullptr, nullptr);
	glfwMakeContextCurrent(window);

	// Initialize GLEW
	glewExperimental = GL_TRUE;
	glewInit();

	// Setup viewport
	int width, height;
	glfwGetFramebufferSize(window, &width, &height);
	glViewport(0, 0, width, height);

	// Enable stencil test, and get stencil bits
	glEnable(GL_STENCIL_TEST);
	int stencil_bits = 0;
	glGetIntegerv(GL_STENCIL_BITS, &stencil_bits);
	std::cout << stencil_bits;

	// window loop
	while (!glfwWindowShouldClose(window)) {
		glfwPollEvents();
		glfwSwapBuffers(window);
	}

	return 0;
}

Someone in an OpenGL post a long time ago said something about glutInitDisplayMode(.... | GLUT_STENCIL), but I don't know how to do that in GLFW. If this could fix the problem, does anyone know the equivalent function in GLFW?

GL_STENCIL_BITS isn’t a valid enumerant for glGetIntegerv() in OpenGL 3.3 core profile. You’re supposed to use glGetFramebufferAttachmentParameteriv() with GL_FRAMEBUFFER_ATTACHMENT_STENCIL_SIZE instead.

The GLFW equivalent of glutInitDisplayMode() is glfwWindowHint(), but the default is already 8 stencil bits.

See this link. Search for “stencil”.
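E.g., a minimal sketch (note that the hint has to be set before glfwCreateWindow(), and the query needs the context to be current):

glfwWindowHint(GLFW_STENCIL_BITS, 8);	// before glfwCreateWindow(); 8 is already the default
// ... create the window, make the context current, initialize GLEW ...
GLint stencil_bits = 0;
glGetFramebufferAttachmentParameteriv(GL_DRAW_FRAMEBUFFER,
	GL_STENCIL, GL_FRAMEBUFFER_ATTACHMENT_STENCIL_SIZE, &stencil_bits);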

I tried glGetFramebufferAttachmentParameteriv() like GClements recommended:

glGetFramebufferAttachmentParameteriv(GL_DRAW_FRAMEBUFFER,
	GL_STENCIL, GL_FRAMEBUFFER_ATTACHMENT_STENCIL_SIZE, &stencil_bits);

std::cout << stencil_bits still gave me 0 though. Oddly, the .exe I built returns 8, as expected, on a university computer and my friend's computer, for both glGetIntegerv() and glGetFramebufferAttachmentParameteriv(). Could there be a bug in my drivers or something that I could report?

Anyway, thanks for the GLFW function and the link.

This can be hardware-dependent. If your graphics hardware does not support a stencil buffer, then no amount of asking for one will give you one. It’s always useful if you say what your hardware is.

I suggest that you run the OpenGL Extensions Viewer, select "Display Modes & Pixel Formats" in the left pane, and in the bottom "Pixel Formats" section use the numeric up/down control to scroll through the available formats on your PC, looking for anything with 8 stencil bits.

Possibly related:

If you want a stencil buffer, you should request it. For instance:
glfwWindowHint( GLFW_STENCIL_BITS, 8 );
glfwWindowHint( GLFW_DEPTH_BITS , 24 );

[QUOTE=Dark Photon;1286803]Possibly related:

If you want a stencil buffer, you should request it. For instance:
glfwWindowHint( GLFW_STENCIL_BITS, 8 );
glfwWindowHint( GLFW_DEPTH_BITS , 24 );[/QUOTE]

That would contradict the GLFW documentation (as GClements also noted). In this case there is probably a bug somewhere, either in the documentation or in the code: if the default stencil hint is 8 but isn't honored, while an explicit call setting the same hint works, something is wrong.

I tried those two glfwWindowHint calls before trying to get the stencil size, but I’m still getting 0 bits.

I’m pretty sure I have a stencil buffer. I was able to run the Stencil Testing tutorial on Learn OpenGL. Functions like glStencilOp, glStencilFunc, and glStencilMask work as expected.

I tried OpenGL Extensions Viewer, and I was able to find my stencil bits: WGL_STENCIL_BITS_ARB = 8, which I'm happy to know now. It was in Pixel Format 3, if that means anything; Pixel Formats 1 and 2 give 0 for that value. It's weird that I can't get a GL function to return something like 8 bits. Does anyone think it's still possible?

Here are my laptop's hardware details from the Extensions Viewer, as requested, in case they help.
System Info.
Renderer: Intel® HD Graphics 4000
Adapter RAM: 2048 MB
Monitor: Generic PnP Monitor
Display: 1600 x 900 x 32 bpp (60 Hz)
Operating System: Microsoft Windows 8.1
Processor: Intel® Core™ i7-3632QM CPU @ 2.20GHz, Ivy Bridge, Family 6h, Model: 3ah, Stepping: 9h

OpenGL
Version: 4.0
Driver version: 12.104.0.0 (28-Mar-13)

The Intel HD 4000 does have stencil support, and it's possible to create a context with a 24/8 depth/stencil buffer using either native WGL (I don't know about GL on other platforms) or D3D, so if anything this seems like a bug in GLFW's context creation.

I do recall that with some older Intels you will be given a 16/0 depth/stencil buffer if you request anything that’s not supported in hardware, but I haven’t seen this behaviour in at least 5 years.

Does GLFW allow you to enumerate the pixel formats and explicitly select one? Unfortunately, GL context creation is loosely specified and is allowed to give you a "best" approximation of what you asked for, rather than an exact match (with a failure and a meaningful error if it can't), and it seems you're falling into that hole.
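If GLFW doesn't expose that, you can at least see the formats yourself with raw WGL. A rough sketch (error handling omitted; hdc is the window's device context):

#include <windows.h>
#include <iostream>

// Walk every pixel format on a device context and print the
// OpenGL-capable ones that have stencil bits.
void ListStencilFormats(HDC hdc)
{
	PIXELFORMATDESCRIPTOR pfd;
	// With a null descriptor, DescribePixelFormat() returns the highest format index.
	int count = DescribePixelFormat(hdc, 1, sizeof(pfd), NULL);
	for (int i = 1; i <= count; ++i)
	{
		DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
		if ((pfd.dwFlags & PFD_SUPPORT_OPENGL) && pfd.cStencilBits > 0)
			std::cout << "format " << i << ": " << (int)pfd.cStencilBits
				<< " stencil bits, " << (int)pfd.cDepthBits << " depth bits\n";
	}
}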

One way to get around this problem is to use your own framebuffer object. It also has some advantages the default framebuffer doesn't: you can use texture attachments and read from them later in a fragment shader (if you want to post-process the frame, for example).
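A rough sketch of that approach (800x600 hardcoded for illustration; error checks omitted):

// Color goes to a texture, depth/stencil to a combined
// GL_DEPTH24_STENCIL8 renderbuffer.
GLuint fbo, color_tex, depth_stencil_rb;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

glGenTextures(1, &color_tex);
glBindTexture(GL_TEXTURE_2D, color_tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 800, 600, 0, GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);	// no mipmaps
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, color_tex, 0);

glGenRenderbuffers(1, &depth_stencil_rb);
glBindRenderbuffer(GL_RENDERBUFFER, depth_stencil_rb);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, 800, 600);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT, GL_RENDERBUFFER, depth_stencil_rb);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
	std::cout << "FBO incomplete" << std::endl;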

As for the default framebuffer, try this code; it gives me 24 bits depth + 8 bits stencil ("me" being an Intel Core i5 4430 + NVIDIA GT 640):

#include <iostream>
#define GLEW_STATIC
#include <GL/glew.h>
#include <GLFW/glfw3.h>


using namespace std;


void CheckForGLError()
{
    for (GLenum error; (error = glGetError()) != GL_NO_ERROR;)
    {
        cout << "OpenGL Error:  	";
        if (error == GL_INVALID_ENUM)
            cout << "GL_INVALID_ENUM";
        if (error == GL_INVALID_VALUE)
            cout << "GL_INVALID_VALUE";
        if (error == GL_INVALID_OPERATION)
            cout << "GL_INVALID_OPERATION";
        if (error == GL_STACK_OVERFLOW)
            cout << "GL_STACK_OVERFLOW";
        if (error == GL_STACK_UNDERFLOW)
            cout << "GL_STACK_UNDERFLOW";
        if (error == GL_OUT_OF_MEMORY)
            cout << "GL_OUT_OF_MEMORY";
        if (error == GL_INVALID_FRAMEBUFFER_OPERATION)
            cout << "GL_INVALID_FRAMEBUFFER_OPERATION";
        if (error == GL_CONTEXT_LOST)
            cout << "GL_CONTEXT_LOST";
        cout << (char)7 << endl;        /*play sound*/
        cin.get();
    }
}


void glfw_error_callback(int error, const char* description)
{
    cout << "GLFW ERROR: 	" << description << endl;
}


void framebuffer_size_callback(GLFWwindow* window, int width, int height)
{
    glViewport(0, 0, width, height);
}


void Render()
{
    /* clear framebuffer */
    glClearColor(0, 1, 0, 0);
    glClearDepth(1);
    glClearStencil(0);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

    /* render here */

    /* check for GL errors */
    CheckForGLError();
}


int main()
{
	// Initialize GLFW
	if (!glfwInit())
        return -1;

    bool fullscreen = false;
	glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
	glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
	glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
	glfwWindowHint(GLFW_STENCIL_BITS, 8);

	GLFWmonitor* monitor = NULL;
	if (fullscreen)
        monitor = glfwGetPrimaryMonitor();
	GLFWwindow* window = glfwCreateWindow(800, 600, "OpenGL", monitor, nullptr);
	if (!window)
    {
        glfwTerminate();
        return 1;
    }

	glfwMakeContextCurrent(window);
	glfwSetErrorCallback(glfw_error_callback);
	glfwSetFramebufferSizeCallback(window, framebuffer_size_callback);


	// Initialize GLEW
	glewExperimental = GL_TRUE;
	if (glewInit() != GLEW_OK)
    {
        glfwTerminate();
        return 2;
    }

	// Setup viewport
	int width, height;
	glfwGetFramebufferSize(window, &width, &height);
	glViewport(0, 0, width, height);

	/**
	If the specified framebuffer is a framebuffer object, attachment must be one of
        GL_DEPTH_ATTACHMENT,
        GL_STENCIL_ATTACHMENT,
        GL_DEPTH_STENCIL_ATTACHMENT,
        or GL_COLOR_ATTACHMENTi, where i is between zero and the value of GL_MAX_COLOR_ATTACHMENTS minus one.

    If the default framebuffer is bound to target, attachment must be one of
        GL_FRONT_LEFT,
        GL_FRONT_RIGHT,
        GL_BACK_LEFT,
        GL_BACK_RIGHT,
        GL_DEPTH
        or GL_STENCIL, identifying the corresponding buffer.
	**/
	GLint depth_bits = 0, stencil_bits = 0;
	glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_DEPTH, GL_FRAMEBUFFER_ATTACHMENT_DEPTH_SIZE, &depth_bits);
	glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_STENCIL, GL_FRAMEBUFFER_ATTACHMENT_STENCIL_SIZE, &stencil_bits);
	cout << "depth bits: " << depth_bits << endl;
	cout << "stencil bits: " << stencil_bits << endl;

	// window loop
	while (!glfwWindowShouldClose(window)) {
        Render();
		glfwPollEvents();
		glfwSwapBuffers(window);
	}

	return 0;
}

It's important to check for errors (GL errors and others, like GLFW errors); otherwise you don't know what's wrong with your code.

I tried your code, john, and I'm getting a GL_INVALID_ENUM error. I tried glGetError() right after both of the glGetFramebufferAttachmentParameteriv() calls you have, and I get the same error there. It's the same thing in my own code.

Does this say anything? From the docs, this error would only appear if the 1st or 3rd argument to that function is invalid, but I’m pretty sure the arguments are correct.

You might try changing GL_STENCIL to GL_STENCIL_ATTACHMENT or GL_DEPTH_STENCIL_ATTACHMENT.

If those work, it's a bug in the driver; GL_STENCIL is the correct parameter name for the default framebuffer, while the other two are correct for an FBO.
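I.e., something like:

// per the spec, for the default framebuffer:
glGetFramebufferAttachmentParameteriv(GL_DRAW_FRAMEBUFFER,
	GL_STENCIL, GL_FRAMEBUFFER_ATTACHMENT_STENCIL_SIZE, &stencil_bits);
// as a test for a driver bug, the FBO-style names:
glGetFramebufferAttachmentParameteriv(GL_DRAW_FRAMEBUFFER,
	GL_STENCIL_ATTACHMENT, GL_FRAMEBUFFER_ATTACHMENT_STENCIL_SIZE, &stencil_bits);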

I tried changing GL_STENCIL to the other two; neither worked. Yeah, I'm pretty sure this is a bug in my drivers.

Following up on mhagain's comment about pixel formats: I can't find anything saying you can select a specific pixel format in GLFW. But it gave me the idea to find a minimal Win32 OpenGL example where I could set the pixel format explicitly. I found a 'minimal.c' file here.

I was able to copy-paste it. I had to swap the arguments of ReleaseDC() for some reason, but after that the program worked. I changed the second argument of SetPixelFormat() to pixel format id 3, which I knew had 8 stencil bits (from the OpenGL Extensions Viewer). Then I tried glGetIntegerv() with GL_STENCIL_BITS in the display() function and wrote the value to a file. It was still 0. If I can control the pixel format explicitly and still get 0 stencil bits, I'm pretty sure it's my laptop's drivers.
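The change was roughly this (variable names are from the sample, from memory; format id 3 hardcoded from what the Extensions Viewer showed on my laptop):

int format = 3;	// the pixel format with 8 stencil bits on my machine
DescribePixelFormat(hDC, format, sizeof(pfd), &pfd);	// fill pfd to match the format
SetPixelFormat(hDC, format, &pfd);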

(Also, my last post was a mistake: I ran the code on an "AMD Radeon HD 7570M/HD 7670M Graphics" card instead of the Intel one; my laptop has switchable graphics. The AMD card produces the GL_INVALID_ENUM error, but the Intel card doesn't.)

Anyway, thanks everyone for the help. I learned about glGetError() and pixel formats, and I got to make a Win32 OpenGL program work for the first time, which was pretty cool.