Lev, no problem!
To all,
I tried many things yesterday evening.
First, I was wrong about the glGetString() call: it was made before glutCreateWindow(), so the context wasn't created yet
when it ran. That's why it gave me a null string.
So now glGetString(GL_EXTENSIONS) gives me many things: it reports ARB, NV, EXT, SGIS, IBM and KTX extensions
(under Linux), and of course vertex_array_range, vertex_array_range2, vertex_array_program and
draw_range_elements. This was under GLUT.
Under GLX (I tried a GLX demo program), none of the array or draw extensions are supported (maybe because it uses
Mesa).
So I understand even less why it doesn't work.
I have tried every combination of values for x, y, z in glXAllocateMemoryNV(1000 * sizeof(GLfloat), x, y, z);
none of them worked.
Concerning glGetString(GL_RENDERER):
under GLUT, it returns "GeForce2 MX/PCI/3DNOW!";
under GLX, it returns "Mesa X11".
Do you think I have to recompile GLUT? My version is the original one provided by Mandrake (so it uses Mesa).
When I try to allocate in the glxdemo program, it stops with a segmentation fault in
glXAllocateMemoryNV, which is looked up via glXGetProcAddressARB.
Any ideas are welcome.
Thanks,
JD