Glx 1.3 and nvidia drivers

I don’t see an evident flaw, so here are some simple remarks:

  • it seems you noticed that the XSelectInput() call is unnecessary if you set up the event mask via the window attributes

  • I guess you want a non-blocking loop to render as fast as possible. I suggest you either remove all X event related code, or fix it as follows:

  • remove the XWindowEvent() call
  • use this loop :

while (1)
{
    /* XPending() never blocks: it just reports how many events are queued. */
    if (XPending(disp) > 0)
    {
        XEvent event;
        /* XNextEvent() is normally blocking, but won't block here
           since we know at least one event is queued. */
        XNextEvent(disp, &event);
        switch (event.xany.type)
        {
            /* handle Expose, ConfigureNotify, KeyPress, ... */
        }
    }
    DrawGL();
    glXSwapBuffers(disp, xWin);
}

Some explanation: XCheckWindowEvent() only removes events that match the given window and event mask, so any non-matching events pile up and you’ll end up with an ever-growing event queue. To poll for events, XPending() is fine; then you call XNextEvent(), which is blocking, but won’t block here since we know there’s at least one event in the queue. Then process the events. Of course, if there are no events, we just have a fast while(1){} loop (it will eat up all your CPU+GPU time).

  • if you confirm your glClearColor is effectively drawn, you’re really close to success. Animate your clear color and make sure you can display a ‘flashing’ filled window: then we’ll be sure the redraw is all right and we’re not just getting lucky on the first frame

  • then you evidently have a problem with the pure OpenGL functions: check your viewport, transformations, colors, z-buffering, and all the nasty switches that easily turn your output so daaaark

  • add an XFlush(disp) after the XMapWindow() call; that way you can be reasonably sure your first frame renders after the X server has processed the map request (XMapWindow() is not blocking, so this just nudges things towards happening synchronously)

  • the border_pixel attribute of the window is ignored by 99.9% of X server/window manager combinations, forget it

OK, I figured it out. It turns out I was setting some global variables for the x and y resolution in the SDL code that I commented out, but I was still using those variables to calculate the aspect ratio in gluPerspective, so the camera viewport was of size 0…

Anyway, I’m glad I posted again, because you pointed out some stuff that I probably would have figured out the hard (and long) way otherwise…

thanks for your help

Originally posted by zerodeux:
I had a quick look: using glXGetConfig() and querying the GLX_SAMPLE_BUFFERS_ARB and GLX_SAMPLES_ARB attributes, 12 out of my 42 available visuals expose ‘1 buffer and 2 samples’ (the others report 0 and 0).

Really… Very strange. I tried the same and got nothing. More precisely, some visuals reported 0 and 0, and others failed with glXGetConfig returning GLX_BAD_VISUAL (‘not known by GLX’).

Are you using the latest nVidia drivers? (2313) And what XFree86 version?

As to your questions: the glEnable is not even necessary. And ‘Quincunx’ is indeed correctly exposed, as a
glHint(GL_MULTISAMPLE_FILTER_HINT_NV)
in a 2-sample visual.
(The so-called ‘gaussian’ multisample mode is the same glHint in a 4-sample visual.)

And personally, I don’t like them so much: they blur interior polygon pixels (textures). Sample re-use… There Ain’t No Such Thing As A Free Lunch…

Are you using the latest nVidia drivers? (2313) And what XFree86 version?

I’m using the latest drivers (2313) with XFree86 4.1.0, kernel 2.4.16, and the hardware is a GeForce3 from NVidia (developer retail).

This is disturbing: using ‘glxinfo -t’, I read all 42 visuals with the last two columns set to 0 (multisample sample count and buffer count). However, my code using glXGetConfig() does return valid values for GLX_SAMPLE_BUFFERS_ARB and GLX_SAMPLES_ARB (1 and 2 respectively). I can extract the corresponding code and give some output if needed.

Originally posted by zerodeux:

I can extract the corresponding code and give some output if needed.

Yes, that would be helpful.
Both the code snippet and the output (like the visual IDs that you get).
My code does this to get all the visuals:
XVisualInfo *vi = XGetVisualInfo(dpy, 0, &tmp, &n);
Then, for each element, a bunch of glXGetConfig() calls for tokens like
GLX_RED_SIZE, etc., and the ones of interest: GLX_SAMPLES_ARB and GLX_SAMPLE_BUFFERS_ARB.

I get visual IDs 0x21 through 0x4a (42 in total); 0x2c through 0x36 and 0x40 through 0x4a make glXGetConfig return GLX_BAD_VISUAL.

I use XFree86 4.0.3, not newer, but the log says the nvidia modules (glx, nvidia_drv) were “compiled for 4.0.2”, so I wasn’t worried about my XFree86 version.
Also, my X depth is 24; perhaps we have a difference there?
