A question about glReadPixels() in Windows

I am trying to use glReadPixels() in Visual Studio. It works fine in a project that uses a GLUT window, but in another project that uses MFC for display, it can't read the pixels correctly. Can anyone give me a hint? I have no idea what's wrong, and I'm not familiar with glReadPixels(). Please see the code below.

void display(void)
{
	icVector3 up, ray, view;
	int total_bits;

// mapping from 0, center.z+3R to 0 and 1

	glOrtho(-Object.radius*1.0, Object.radius*1.0, -Object.radius*1.0, Object.radius*1.0, vd_n, vd_f); // 0.0, Object.center.entry[2]+3.0*Object.radius);
	view = Object.center - Object.radius;
ray = Object.center - view;

up.set(1.0, 0.0, 0.0);
if (fabs(dot(ray, up))>0.707) 
	up.set(0.0, 1.0, 0.0);

//total_bits = int(log(Object.nverts)/log(2)+0.5);
total_bits = int(log((double)Object.nverts)/log(2*1.0)+0.5);

gluLookAt(view.entry[0], view.entry[1], view.entry[2], Object.center.entry[0], Object.center.entry[1], Object.center.entry[2], up.entry[0], up.entry[1], up.entry[2]);

glGetDoublev(GL_MODELVIEW_MATRIX, modelview_matrix);
glGetDoublev(GL_PROJECTION_MATRIX, projection_matrix);
glGetIntegerv(GL_VIEWPORT, viewport);

// glCallList(theObject);
unsigned char a, b, c;
unsigned int i, j;
Face *face;
int *verts;

	fprintf(stderr, "in\n");

	for (i=0; i<20; i++) { // Object.nfaces; i++) {
		if (i>=Object.nfaces)
			break;
		a = i >> 16;
		b = (i & 0xFF00) >> 8;
		c = i & 0xFF;
		glColor3ub(255-a, 255-b, 255-c);
		face = Object.flist[i];
		verts = face->verts;
		glBegin(GL_POLYGON); // glVertex calls are only valid between glBegin and glEnd
		for (j=0; j<face->nverts; j++) {
			glVertex3d(Object.vlist[verts[j]]->x, Object.vlist[verts[j]]->y, Object.vlist[verts[j]]->z);
		}
		glEnd();
	}
	fprintf(stderr, "out\n");

glReadPixels(0, 0, win_width, win_height, GL_RGB, GL_UNSIGNED_BYTE, pixels);
glReadPixels(0, 0, win_width, win_height, GL_DEPTH_COMPONENT, GL_FLOAT, depths);

static int counterMARK = 0;
char buffer[128];
sprintf(buffer,"mark%0.4i.ppm", counterMARK++);
SSS_SavePPM(win_width, win_height, pixels, buffer);

	// glDrawPixels(win_width, win_height, GL_RGB, GL_UNSIGNED_BYTE, pixels);
}
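(For reference, the colour-picking trick in the loop above packs a face index into an RGB triple and inverts each component with 255-x. That arithmetic can be sketched and verified on its own, without a GL context; the helper names below are hypothetical, not from the original code:

#include <assert.h>

/* Hypothetical helpers mirroring the index-to-colour packing in the loop above.
 * Valid for indices below 2^24, since only 24 bits fit in an RGB8 colour. */
static void index_to_rgb(unsigned int i,
                         unsigned char *r, unsigned char *g, unsigned char *b)
{
    *r = (unsigned char)(255 - ((i >> 16) & 0xFF));
    *g = (unsigned char)(255 - ((i >> 8) & 0xFF));
    *b = (unsigned char)(255 - (i & 0xFF));
}

/* Inverse mapping: recover the face index from a pixel read back with glReadPixels. */
static unsigned int rgb_to_index(unsigned char r, unsigned char g, unsigned char b)
{
    return ((unsigned int)(255 - r) << 16) |
           ((unsigned int)(255 - g) << 8)  |
            (unsigned int)(255 - b);
}

int main(void)
{
    unsigned char r, g, b;
    index_to_rgb(123456u, &r, &g, &b);
    assert(rgb_to_index(r, g, b) == 123456u); /* round-trip must be exact */
    return 0;
}

On readback you would apply rgb_to_index to each 3-byte pixel from the GL_RGB buffer to recover which face was drawn there.)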

Thank you for any help!!!

Can you explain what you mean by "it can't read the pixels correctly", please? What is happening? What does glGetError say? From the fact that you've commented out your "glutSwapBuffers" call and use glFlush/glFinish at the end of your display routine, I'm guessing that you have a single-buffered context, yet you're setting your glReadBuffer to GL_BACK. But without further info, a guess is the best you're going to get.
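(Since glGetError is the first thing to check here, a minimal sketch of an error-naming helper follows. The gl_error_name function is a hypothetical addition, not part of any GL API; the numeric values are the standard ones from gl.h, redefined locally so the sketch compiles without the GL headers. Calling glGetError itself of course requires a current context:

#include <stdio.h>

/* GL error codes as specified in gl.h, redefined so this sketch is standalone. */
#define E_GL_NO_ERROR          0x0000
#define E_GL_INVALID_ENUM      0x0500
#define E_GL_INVALID_VALUE     0x0501
#define E_GL_INVALID_OPERATION 0x0502
#define E_GL_STACK_OVERFLOW    0x0503
#define E_GL_STACK_UNDERFLOW   0x0504
#define E_GL_OUT_OF_MEMORY     0x0505

/* Hypothetical helper: map a glGetError() result to a readable name. */
static const char *gl_error_name(unsigned int err)
{
    switch (err) {
    case E_GL_NO_ERROR:          return "GL_NO_ERROR";
    case E_GL_INVALID_ENUM:      return "GL_INVALID_ENUM";
    case E_GL_INVALID_VALUE:     return "GL_INVALID_VALUE";
    case E_GL_INVALID_OPERATION: return "GL_INVALID_OPERATION";
    case E_GL_STACK_OVERFLOW:    return "GL_STACK_OVERFLOW";
    case E_GL_STACK_UNDERFLOW:   return "GL_STACK_UNDERFLOW";
    case E_GL_OUT_OF_MEMORY:     return "GL_OUT_OF_MEMORY";
    default:                     return "unknown GL error";
    }
}

int main(void)
{
    printf("%s\n", gl_error_name(0x0502));
    return 0;
}

In the real program you would drain the error queue right after the glReadPixels calls, e.g. while ((err = glGetError()) != GL_NO_ERROR) fprintf(stderr, "GL error: %s\n", gl_error_name(err)); since glGetError returns one queued error per call.)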

OpenGL behaviour does not depend on factors such as the OS, compiler, or windowing framework used; it's all in the driver, video RAM, and system memory buffers, so the same code should work the same way regardless.

Thank you for your help, mhagain.

By "it can't read the pixels correctly", I mean that in the MFC program it returns zero for every pixel value, which is not what should happen, while in the GLUT-based program glReadPixels() returns the pixel values correctly. The "glutSwapBuffers" call and the glFlush/glFinish calls are commented out because I am using the MFC interface to display. Could you please give me some further help? Thank you.
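(To confirm the all-zero symptom programmatically rather than by eyeballing the saved PPM, a trivial scan of the readback buffer is enough. The helper name below is hypothetical:

#include <assert.h>
#include <stddef.h>

/* Hypothetical helper: returns 1 if every byte of the readback buffer is zero,
 * i.e. exactly the symptom described above. */
static int buffer_is_all_zero(const unsigned char *pixels, size_t nbytes)
{
    size_t i;
    for (i = 0; i < nbytes; i++)
        if (pixels[i] != 0)
            return 0;
    return 1;
}

int main(void)
{
    unsigned char zeros[12] = {0};
    unsigned char mixed[12] = {0};
    mixed[7] = 42; /* one non-zero byte */
    assert(buffer_is_all_zero(zeros, sizeof zeros) == 1);
    assert(buffer_is_all_zero(mixed, sizeof mixed) == 0);
    return 0;
}

For a GL_RGB / GL_UNSIGNED_BYTE read of the whole window, the buffer holds win_width * win_height * 3 bytes, so that is the size to pass.)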

What does glGetIntegerv(GL_DEPTH_BITS, &bits) return?

Have you tried not changing glReadBuffer? In a single-buffered context you've only got one buffer (front-left, but see the documentation), so that call shouldn't have any effect (beyond setting an error), but all the same, drivers have been known to behave strangely. The default read buffer should be whatever is appropriate for your GL context, so comment out that call and see what happens.

Also remember that glReadPixels is not guaranteed to return valid data if the frame buffer itself is (partially) off-screen or obscured by other windows. You are only guaranteed valid data for regions that are visible on your monitor; in all other cases the result may be undefined. On some platforms/hardware/drivers you may get valid off-screen data, but I would not rely on that behaviour. Keyword: "pixel ownership test".
