The difference in OGL on Linux and Windows

Ah, I see you are using GLUT. At least w.r.t. GLUT and GLX, WineX should be redundant, since GLUT is available for both platforms; just recompile and it should work.

For GLUT the window attributes are defined in the glutInitDisplayMode call. GLUT_DEPTH seems to be the token, but there is no direct control over the depth buffer size. This seems like a weakness of that API.

What you get will depend on the details of the code inside GLUT that creates the visual attributes on each native platform. The alternative is to go with another approach to windowing. Of course I don't actually know what GLUT does w.r.t. the bits requested; I haven't looked at the code. You are still calling through WineX for one platform, which might create the issue regardless of what you decide to do.
http://www.opengl.org/developers/documentation/glut/spec3/node12.html#SECTION00033000000000000000

It also occurred to me that your available visuals and therefore depth buffer quality may be affected by your desktop mode, for example your color quality in the display settings dialog on Windows and the /etc/X11/XFree86-4 configuration file on Linux.

I know that some hardware ties the depth buffer size to the color depth, and the color depth to the desktop depth, so the depth buffer you get can be determined by the display properties. Something like a 16 bit X desktop instead of a 24 bit one may be causing your problem, while you simply have a 32 bit (true color) desktop setting on Windows instead of a 16 bit one.

[This message has been edited by dorbie (edited 02-19-2003).]

Originally posted by dorbie:
It also occurred to me that your available visuals and therefore depth buffer quality may be affected by your desktop mode…

Yes, that was what I hinted earlier too. nukem never answered what color depth he was using. I have noticed that many Linux users seem to be using a 16 BPP desktop, while most Windows users have 24/32 BPP. I don’t know why this is - is it a default setting for most common Linux distros?

Sorry, I thought I did tell you what my color mode was: it's 24 bit. I tried all the other color modes and resolutions as well.

dorbie: I only used WineX once to see how well my program worked on Windows. Right now I'm really trying to get it working flawlessly on Linux. It may be something with my Z buffer, but how would I make that bigger and then clear it?

marcus256: I put your code in both before and after (it should be after, right?). All I see is a black triangle in the bottom right of the window. I'm going to play with the values a bit.

I see that in my init function I have gluPerspective(50, 1, 0.1, 10); could this be screwing me up? When I take it out, nothing happens.

Take a look at the source code.

nukem, I’ve already suggested how in my posts, you’ll have to use native calls as I described or edit the native calls in the GLUT source code.
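
Roughly, the native route looks like this. A minimal sketch only, and the attribute list is an assumption about what your program would want, not taken from your code; the key points are that dpy is an X server connection you open yourself with XOpenDisplay, and GLX_DEPTH_SIZE lets you ask for an explicit minimum depth buffer size:

#include <GL/glx.h>
#include <X11/Xlib.h>
#include <stdio.h>

int main()
{
	/* dpy is the X server connection the GLX docs refer to */
	Display *dpy = XOpenDisplay(NULL);	/* NULL = use $DISPLAY */
	if (!dpy) {
		printf("Cannot open X display\n");
		return 1;
	}

	/* RGBA, double buffered, with at least 16 depth bits.
	   GLX treats the sizes as minimums, not exact values. */
	int attribs[] = {
		GLX_RGBA,
		GLX_DOUBLEBUFFER,
		GLX_DEPTH_SIZE, 16,
		None
	};
	XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
	if (!vi)
		printf("No matching visual - is the desktop running at 16 bpp?\n");
	else
		printf("Got visual 0x%lx\n", (unsigned long)vi->visualid);

	XCloseDisplay(dpy);
	return 0;
}

From there you would normally pass vi on to XCreateWindow and glXCreateContext, which is exactly the plumbing GLUT hides from you.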

One thing you could spend more time on is making that glGet call work. It should work; that it doesn't on either platform is very strange, and I posted some comments earlier on this. Seeing a sane value from glGet would give you the confidence to aggressively pursue some of these options, I think.

dorbie: Sorry, I must have misunderstood you; I thought that because I'm using GLUT I can't use those calls.

Interesting, I added this code

glutInitDisplayMode(GLUT_DEPTH);
int bitcount;
glGetIntegerv(GL_DEPTH_BITS, &bitcount);
cout << bitcount << endl;

before my program drew anything, and the bitcount is still 0. I'm trying to play with the glXChooseVisual call you mentioned, but dpy is giving me problems: no matter how I declare it, it gives me an error, and there are very few docs on it. This is all I found: http://www.opengl.org/developers/documen…isual#first_hit

[This message has been edited by nukem (edited 02-19-2003).]

Originally posted by nukem:
…before my program drew anything, and the bitcount is still 0.

Note that glutInitDisplayMode does not actually create your window, so you have no GL context yet. You need to open the window first (glutCreateWindow).
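
To make the ordering concrete, here is a minimal sketch (the window title and the empty display callback are just placeholders, not taken from your code):

#include <GL/glut.h>
#include <iostream>

void display(void) { /* drawing goes here */ }

int main(int argc, char **argv)
{
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
	glutCreateWindow("depth test");	/* the GL context exists from here on */

	/* only now does glGet return meaningful values */
	GLint depthBits = 0;
	glGetIntegerv(GL_DEPTH_BITS, &depthBits);
	std::cout << "depth bits: " << depthBits << std::endl;

	glutDisplayFunc(display);
	glutMainLoop();
	return 0;
}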

Originally posted by nukem:
Take a look at the source code.

Thanks. I downloaded it and compiled it, and got nothing. I looked at it, made a few changes, and now it works. I even have animation (but it looks really strange!)

Key fixes:

  1. Remove “scale = new float” and “x = y = z = new float” from Resize()
  2. Remove matrix setup (projection & modelview) from Init() - it’s redundant code
  3. Replace display() with this code:
void display()
{
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	float scale;
	float x;
	float y;
	float z;

	mdl.Resize(&scale, &x, &y, &z);

	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	gluPerspective(55.0, (GLdouble)width/(GLdouble)height, 2.0, 6.0);

	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
	gluLookAt(0.0, -4.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0);

	glRotatef(rotx, 0, 1, 0);
	glRotatef(roty, 1, 0, 0);

	glScalef(scale, scale, scale);
	glTranslatef(x, y, z);

	glEnable(GL_DEPTH_TEST);
	glCullFace(GL_FRONT);

	DisplayMDL();

	glutSwapBuffers();
	glutPostRedisplay();
}

Note how I changed the order of the transformations, and removed the Push/Pop calls, since they do nothing good (you reinitialize the matrix every time anyway).

I also added width & height as global variables (int), and changed the reshape function to the following:

void reshape(int w, int h)
{
	width = w;
	height = h > 0 ? h : 1;
	glViewport(0, 0, w, h);
}

This way you can change the window size, and still get a properly sized model.

BTW, you can get the current time (in milliseconds) with glutGet( GLUT_ELAPSED_TIME ).
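
For example, an idle callback along these lines keeps the rotation speed independent of the frame rate (rotx is assumed to be the same global that display() rotates by, and 90 degrees per second is just a made-up rate):

#include <GL/glut.h>

extern float rotx;	/* assumed: the rotation global read by display() */

void idle(void)
{
	static int lastTime = glutGet(GLUT_ELAPSED_TIME);
	int now = glutGet(GLUT_ELAPSED_TIME);	/* milliseconds since glutInit */
	rotx += 90.0f * (now - lastTime) * 0.001f;
	lastTime = now;
	glutPostRedisplay();
}

/* registered once in main(), after glutCreateWindow: glutIdleFunc(idle); */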

marcus256: I made all the changes you suggested. I removed scale = new float and x = y = z = new float from Resize(), I took out glMatrixMode(GL_PROJECTION); and glMatrixMode(GL_MODELVIEW); from Init(), and I put in your display function and your reshape function. It gave me this. I looked at my Z buffer and it's still 0. Do you think it's the Z buffer, as dorbie said? If so, how would I fix it? I tried glutInitDisplayMode and glXChooseVisual (I can't get this one to work because dpy won't declare for some reason), and neither changed anything. What shall I do now?

Thanks,
Nuke

[This message has been edited by nukem (edited 02-20-2003).]

I get the exact same view, but the quality is better (no Z buffer errors). I run this under Windows NT4 with an Intel 4MB “accelerator” (in 16 BPP mode, by the way).

Try inserting these lines right after glutCreateWindow:

GLint bits;
glGetIntegerv( GL_DEPTH_BITS, &bits );
printf( "Z buffer precision: %d bits/pixel\n", bits );

I get 16 bits with my config.

Uh, did you change your glutInitDisplayMode to:

glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);

???

On a side note, a few bug fixes:

  1. If argc != 2, do return 0; (I get a crash otherwise)
  2. glutKeyboardFunc does not give you GLUT_KEY_UP etc. Use glutSpecialFunc for those keys (see the sketch below)
  3. Don’t update translation, but rather rotation (rotx & roty) with the cursor keys - it’s more interesting
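
For items 2 and 3, something roughly like this (rotx and roty are again assumed to be the globals that display() reads):

#include <GL/glut.h>

extern float rotx, roty;	/* assumed: the rotation globals read by display() */

/* cursor keys arrive through glutSpecialFunc, not glutKeyboardFunc */
void special(int key, int x, int y)
{
	switch (key) {
	case GLUT_KEY_UP:	roty -= 5.0f; break;
	case GLUT_KEY_DOWN:	roty += 5.0f; break;
	case GLUT_KEY_LEFT:	rotx -= 5.0f; break;
	case GLUT_KEY_RIGHT:	rotx += 5.0f; break;
	}
	glutPostRedisplay();
}

/* registered once in main(): glutSpecialFunc(special); */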

marcus256: thank you soooooooooooooooooo much, it finally works now!!! The problem was that I didn't have GLUT_DEPTH in glutInitDisplayMode. I would like to include your name in the source for helping me; post it here or e-mail me. If you'd like to look at the final source, look here.

Thanks again!!!

Lee

[This message has been edited by nukem (edited 02-20-2003).]

Nukem, when people are trying to help you, you should at least put in the effort to read their posts and compare them with your code. Your passive development approach is not a good way to write software.
