I’ve been setting up an FBO with the following texture attachments:
color0 = RGBA 8-bit
color1-3 = none
depth = DEPTH_COMPONENT24
stencil = none
I get GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT as the FBO status, even though the textures all have the same width/height (512).
When I change the depth texture to DEPTH_COMPONENT16, it works and no error is thrown.
Is this my fault, or the driver’s?
The card is an ATI 9600 mobile, which doesn’t support DEPTH24 textures, but it didn’t complain on texture creation, and I assumed the driver would simply pick the next best format to use.
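For reference, a minimal sketch of the setup described above, using the EXT_framebuffer_object entry points. This assumes a current GL context with the extension available; the variable names are mine, and the formats/size match the post:

```c
#include <GL/gl.h>   /* plus glext.h / an extension loader for the EXT entry points */

GLuint color, depth, fbo;

/* color0 = RGBA 8-bit */
glGenTextures(1, &color);
glBindTexture(GL_TEXTURE_2D, color);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 512, 512, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

/* depth = DEPTH_COMPONENT24 */
glGenTextures(1, &depth);
glBindTexture(GL_TEXTURE_2D, depth);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, 512, 512, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

/* attach both to the FBO and check completeness */
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, color, 0);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                          GL_TEXTURE_2D, depth, 0);

GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
/* status should be GL_FRAMEBUFFER_COMPLETE_EXT on success */
```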
Do you create a renderbuffer for the depth, or a texture?
My code starts by trying to create an RGBA8 + DEPTH=32, STENCIL=0 setup, and it succeeds.
With older drivers, I think neither 32- nor 24-bit depth worked, only 16.
All texture attachments; no renderbuffers used.
I had 24-bit depth textures working by using GL_EXT_packed_depth_stencil / GL_NV_packed_depth_stencil. The packed format has 24 bits of depth and 8 bits of stencil.
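A sketch of that approach, assuming a current context exposing EXT_packed_depth_stencil (variable names are mine): the key point is that the same texture object is attached to both the depth and stencil attachment points.

```c
#include <GL/gl.h>   /* plus glext.h for the EXT tokens and entry points */

GLuint ds;
glGenTextures(1, &ds);
glBindTexture(GL_TEXTURE_2D, ds);

/* packed 24-bit depth + 8-bit stencil in one texture */
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8_EXT, 512, 512, 0,
             GL_DEPTH_STENCIL_EXT, GL_UNSIGNED_INT_24_8_EXT, NULL);

/* attach the same texture to both attachment points */
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                          GL_TEXTURE_2D, ds, 0);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_STENCIL_ATTACHMENT_EXT,
                          GL_TEXTURE_2D, ds, 0);
```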
Yes, those work as well if the hardware allows it. I just wondered why it would return INCOMPLETE_DIMENSIONS when the sizes matched. I suspect this is a driver bug, as I haven’t gotten it on GeForce 8 & 6 hardware. But I wanted to make sure I haven’t “missed” something about INCOMPLETE_DIMENSIONS.
Did you retrieve the actual depth texture state (width, height, etc.) after its creation? Does it match what you requested?
“I just wondered why it would return INCOMPLETE_DIMENSIONS, when sizes matched” sounds like they simply used an error value that isn’t really descriptive of the problem.
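That check could look something like this, with glGetTexLevelParameteriv, since the driver is allowed to substitute a different internal format than the one requested. Assumes the depth texture is bound to GL_TEXTURE_2D in a current context:

```c
#include <stdio.h>
#include <GL/gl.h>

/* Query what the driver actually allocated for the bound depth texture. */
GLint w = 0, h = 0, depthBits = 0, internalFmt = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &w);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &h);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_DEPTH_SIZE, &depthBits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &internalFmt);

printf("%dx%d, %d depth bits, internal format 0x%04X\n",
       w, h, depthBits, internalFmt);
```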
My first reaction is that UNSUPPORTED would have been a better error value to use (to try to communicate the depth-buffer request couldn’t be satisfied), but then OpenGL hasn’t exactly got a history of descriptive error reporting. It didn’t matter that much in the early 1.x days, but with FBO it became quite clear the error reporting mechanism is … displaying its age.
Which reminds me: ARB; are there new (and more precise) errors defined for 3.x?
Looks like it is actually spitting out a GL_INVALID_OPERATION on the call to
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_TEXTURE_2D, TextureID, 0);
and then the framebuffer check gives GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT.
So I guess the ATI drivers do not like it when we try to attach a depth texture. I tried with 32-, 24- and 16-bit depth.
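When chasing this kind of thing it helps to log the status by name rather than as a raw enum. A small helper like the following works without any GL headers; the hex values are copied from the EXT_framebuffer_object spec, and the function name is mine:

```c
/* Token values from the EXT_framebuffer_object specification. */
#define GL_FRAMEBUFFER_COMPLETE_EXT                      0x8CD5
#define GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT         0x8CD6
#define GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT_EXT 0x8CD7
#define GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT         0x8CD9
#define GL_FRAMEBUFFER_INCOMPLETE_FORMATS_EXT            0x8CDA
#define GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT        0x8CDB
#define GL_FRAMEBUFFER_INCOMPLETE_READ_BUFFER_EXT        0x8CDC
#define GL_FRAMEBUFFER_UNSUPPORTED_EXT                   0x8CDD

/* Map a glCheckFramebufferStatusEXT result to a readable name. */
static const char *fbo_status_name(unsigned status)
{
    switch (status) {
    case GL_FRAMEBUFFER_COMPLETE_EXT:                      return "COMPLETE";
    case GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT:         return "INCOMPLETE_ATTACHMENT";
    case GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT_EXT: return "INCOMPLETE_MISSING_ATTACHMENT";
    case GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT:         return "INCOMPLETE_DIMENSIONS";
    case GL_FRAMEBUFFER_INCOMPLETE_FORMATS_EXT:            return "INCOMPLETE_FORMATS";
    case GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT:        return "INCOMPLETE_DRAW_BUFFER";
    case GL_FRAMEBUFFER_INCOMPLETE_READ_BUFFER_EXT:        return "INCOMPLETE_READ_BUFFER";
    case GL_FRAMEBUFFER_UNSUPPORTED_EXT:                   return "UNSUPPORTED";
    default:                                               return "UNKNOWN";
    }
}
```

Usage: `printf("FBO: %s\n", fbo_status_name(glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)));`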
I just used the unsized DEPTH_COMPONENT and let the driver pick something itself (Catalyst 7.9, haven’t checked the latest), and that worked.
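That is, roughly (assuming a current context; the size here is the 512 from the original post):

```c
#include <GL/gl.h>

/* Unsized DEPTH_COMPONENT: the driver picks a depth precision it
 * actually supports instead of being forced to honor 16/24/32. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, 512, 512, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
```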
What are all your calls to create the color and depth textures and the FBO?