Difficulty in reading values from depth buffer

Hello,

I have written a program to render a triangulated 3D face model and save it as an image. My application also requires me to compute the exact 3D coordinates and surface normals of the point which (by orthographic projection) produced every foreground pixel in the image. I have a way of throwing out pixels in the image that correspond to the background instead of the actual 3D face model (I know that the background pixels are colored (0,0,0)).

Now, I use glReadPixels to read the RGB values of the rendered 3D model and then store them in an image. The image is all fine. However when I try to use glReadPixels to obtain the depth values, I get garbage. Could someone help me with this? What could be going wrong? Here is a snippet of my code:

DrawModel(); // render a 3D model

glGetDoublev(GL_MODELVIEW_MATRIX, modelMatrix);
glGetDoublev(GL_PROJECTION_MATRIX, projMatrix);
glGetIntegerv(GL_VIEWPORT, viewport);

for (i = 0; i < nr; i++)       /* nr = number of rows (in image) */
{
    for (j = 0; j < nc; j++)   /* nc = number of columns (in image) */
    {
        val = (RGB[(i + j*width)*3] + RGB[(i + j*width)*3 + 1] + RGB[(i + j*width)*3 + 2]) / 3;

        /* black pixels belong to the background, so ignore them */
        if ((int)val != 0)
        {
            /* read the depth value corresp. to pixel (j, nr-1-i) */
            glReadPixels(j, nr-1-i, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &Z1);

            /* get the coordinate of the point on the 3D model that produced this pixel */
            if (gluUnProject(j, nr-1-i, Z1, modelMatrix, projMatrix, viewport, &objX, &objY, &objZ) == GL_TRUE)
            {
                /* 3D point stored in objX, objY, objZ */
                Write (objX, objY, objZ) to a file.
            } // close if
        }
    } // close inner for loop
} // close outer for loop
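For reference, with an orthographic projection and an identity modelview (which is what the render code below sets up), the mapping gluUnProject performs can be written out directly. This is only a sketch under those assumptions; `ortho_unproject` is a hypothetical helper name, and `l, r, b, t, n, f` stand for the glOrtho parameters:

```c
#include <math.h>

/* Sketch (assumption: identity modelview, glOrtho projection).  Maps a
   window-space sample (wx, wy, depth) back to object space the same way
   gluUnProject would.  W, H are the viewport size; l, r, b, t, n, f are
   the glOrtho parameters. */
void ortho_unproject(double wx, double wy, double depth,
                     int W, int H,
                     double l, double r, double b, double t,
                     double n, double f,
                     double *ox, double *oy, double *oz)
{
    /* window coordinates -> normalized device coordinates in [-1, 1] */
    double xn = 2.0 * wx / W - 1.0;
    double yn = 2.0 * wy / H - 1.0;
    double zn = 2.0 * depth - 1.0;   /* depth buffer value is in [0, 1] */

    /* invert the orthographic projection */
    *ox = l + (xn + 1.0) * (r - l) / 2.0;
    *oy = b + (yn + 1.0) * (t - b) / 2.0;
    *oz = -(n + (zn + 1.0) * (f - n) / 2.0);  /* eye space looks down -z */
}
```

With glOrtho(-200, 200, -200, 200, 300, -300) this maps depth 0 to z = -300 (the near plane) and depth 1 to z = 300, so the recovered coordinates land back in the model's own range.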

Here is the code to render the 3D model:

DrawModel()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glViewport(0, 0, width, height);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    gluLookAt(0*cos(theta) + 115*sin(theta), 0.0, 0*sin(theta) + 115.0*cos(theta),
              0.0, 0.0, 0.0, 0.0, 1.0, 0.0);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(-200, 200, -200, 200, 300, -300);

    glBegin(GL_TRIANGLES);

    Render the triangular faces of the model, using normals for lighting.

    glEnd();

    /* buffer to store color values */
    RGB = (unsigned char *) malloc(width * height * sizeof(unsigned char) * 3);
    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, (void *)RGB);

    // Write to a pgm file.

    glutSwapBuffers();
}

As a first comment, it seems that you call glutSwapBuffers and afterwards try to read the buffer (which, after the swap, is probably not the buffer you want). I would guess this alone is the culprit.

I think you inverted i and j when you read val.
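To make the indexing concrete: a buffer filled by glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, RGB) holds tightly packed rows, bottom row first. A minimal sketch of the offset arithmetic under that assumption (`rgb_offset` and `rgb_offset_image` are hypothetical helper names):

```c
#include <stddef.h>

/* Byte offset of pixel (x, y) in window coordinates: y*width pixels in
   the rows below it, plus x pixels in its own row, times 3 bytes each.
   Assumes tightly packed GL_RGB rows, bottom row first. */
size_t rgb_offset(int x, int y, int width)
{
    return ((size_t)y * width + x) * 3;
}

/* Image row i counts from the top, window row y counts from the bottom,
   so image pixel (row i, column j) maps to window row height-1-i. */
size_t rgb_offset_image(int i, int j, int width, int height)
{
    return rgb_offset(j, height - 1 - i, width);
}
```

Compare this with the posted `RGB[(i + j*width)*3]`, which walks down a column instead of across a row whenever width != height (and even for a square image addresses the transposed pixel).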

Hello,

Thanks a lot for your response, and you have indeed raised a valid point. So here’s what I tried: Immediately after reading the color values (inside the DrawModel function), I also read the depth values in an appropriately sized memory block using glReadPixels. I then print those depth values to a file, and I still see complete garbage. So there seems to be some issue with the way I am using glReadPixels for the purpose of obtaining depth values.

DrawModel ()
{
… earlier code as is

/* buffer to store color values */
RGB = (unsigned char *) malloc(width * height * sizeof(unsigned char) * 3);
glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, (void *)RGB);

// Write to a pgm file.

Z1 = (GLfloat*) malloc(width * height * sizeof(GLfloat));
glReadPixels(0,0,width,height,GL_DEPTH_COMPONENT,GL_FLOAT,(void*)Z1);

// print the values in Z1 to a file

glutSwapBuffers();
}

By garbage, what do you mean? What's the range of the values you are getting?

I know that the range of Z values in the 3D model cannot, for instance, lie beyond +/- 20,000. If I get really huge numbers, I suspect that something is fishy.

The depth buffer values you should be getting are normally in the range 0…1. What does your whole render function look like?

Also, a very wild guess based on your posted pseudocode: you could be doing something wrong in the pgm file writer that messes with GL, such as changing the rendering context.

Also, one more very important thing. Do you check with glGetError()?

Do I check with glGetError()? Yes, indeed I did do that at four places in my code: (1) after the glBegin…glEnd loop to render the model, (2) after glReadPixels to read the color values, (3) after glReadPixels to read the depth values, and (4) after every gluUnProject operation. There was no error reported (rather I got a flag of GL_NO_ERROR).

About the pgm file writer - I’ll look into the code and check to see if there is some error there.

Thanks a lot!

Okay, I have checked the pgm code as well, and it's fine. So I think the error lies in the way I am using glReadPixels to obtain the depth component, and really, I get huge numbers, NaNs and what not.
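One way to quantify the "garbage" is to scan the read-back buffer: a correctly read depth buffer contains only values in [0, 1], so any NaN or out-of-range value means the read itself went wrong rather than the model. A minimal sketch, assuming Z1 holds width*height GL_FLOAT values (`count_bad_depths` is a hypothetical helper name):

```c
#include <math.h>

/* Count values that cannot have come from a depth buffer, and report
   the min/max of the values that are in range. */
int count_bad_depths(const float *Z1, int count, float *min_out, float *max_out)
{
    int k, bad = 0;
    float mn = 1.0f, mx = 0.0f;
    for (k = 0; k < count; k++) {
        if (isnan(Z1[k]) || Z1[k] < 0.0f || Z1[k] > 1.0f) {
            bad++;              /* NaN or outside [0, 1]: not a depth value */
            continue;
        }
        if (Z1[k] < mn) mn = Z1[k];
        if (Z1[k] > mx) mx = Z1[k];
    }
    *min_out = mn;
    *max_out = mx;
    return bad;
}
```

If the bad count is the whole buffer, the suspects are reading the wrong buffer (e.g. after the swap) or a context without a depth buffer; if only some values are bad, the reading loop's indexing is the more likely problem.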

Your glReadPixels is fine and should work as is.

Make sure you use the following for pixel storage :


glPixelStorei (GL_UNPACK_SWAP_BYTES , 0);

I also use the following just-to-be-sure, but I don’t think they matter for your case.


glPixelStorei (GL_UNPACK_ALIGNMENT , 1);
glPixelStorei (GL_UNPACK_SKIP_ROWS , 0);
glPixelStorei (GL_UNPACK_SKIP_PIXELS , 0);
glPixelStorei (GL_UNPACK_ROW_LENGTH , 0);
glPixelStorei (GL_PACK_ALIGNMENT , 1);
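For completeness: glReadPixels is a pack operation, so it is the GL_PACK_* state that applies to it, and GL_PACK_ALIGNMENT in particular. With the default alignment of 4, GL pads each returned row up to a multiple of 4 bytes, which shifts every row in a tightly malloc'd width*3 RGB buffer whenever width*3 is not already a multiple of 4. A sketch of the row-size arithmetic (`packed_row_bytes` is a hypothetical helper name):

```c
/* Size in bytes of one image row as glReadPixels returns it: the tightly
   packed size rounded up to the pack alignment. */
int packed_row_bytes(int width, int components, int alignment)
{
    int row = width * components;                           /* tight size */
    return ((row + alignment - 1) / alignment) * alignment; /* round up   */
}
```

For example, a 5-pixel GL_RGB row is 15 bytes tightly packed but 16 bytes with the default alignment of 4, so with a tightly sized buffer each row is written one byte past where the reader expects it. Setting GL_PACK_ALIGNMENT to 1 removes the padding. GL_FLOAT depth rows are always 4-byte aligned, so this affects the color read, not the depth read.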

Also, for GL error checking, try sqrt[-1]'s glIntercept, which is a very convenient tool that eliminates the need to put glGetError() checks everywhere, besides loads of other useful stuff :)