glGenBuffersARB problem

hi folks,

i’m trying to get into using VBOs and ran into problems right from the start, beginning with glGenBuffersARB. when i run the following code, the initial value of array_id remains unchanged, so glGenBuffersARB doesn’t seem to work properly. has anybody else encountered this problem?

i’m working on redhat linux with an nvidia quadro4 900 xgl, driver 43.63

  
typedef void (*PFNGLGENBUFFERSARBPROC) (GLsizei n, GLuint *buffers);

PFNGLGENBUFFERSARBPROC glGenBuffersARB;

uint    array_id = 123456;

void init() {

 if((glGenBuffersARB = (PFNGLGENBUFFERSARBPROC)glXGetProcAddressARB((const GLubyte*)"glGenBuffersARB")) == NULL)
     printf("	GEN BUFFERS ARB NOT FOUND
");

 glGenBuffersARB(1, &array_id);
 printf("
	bufid = %i
", array_id); }

You need an active rendering context at the time you generate buffer IDs, or rather, any time you do anything with OpenGL.
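
A quick way to verify that, for what it’s worth (a minimal sketch, assuming GLX):

 /* glXGetCurrentContext() returns NULL when no context is current
    on the calling thread */
 if(glXGetCurrentContext() == NULL)
	printf("no current GL context!\n");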

sorry- i HAVE a current context, which of course you couldn’t see in the code snippet…actually i’m sure that i have everything i need to get it working- that’s why i put this in the advanced and not in the beginners’ forum :smiley:

if you take a look at this- i’ve tried it on redhat/nvidia quadro at work and on suse 9.0 with geforce4mx440/driver version 61.11 at home. it just doesn’t work…the values in vbuf do not change

 #define        GLX_GLXEXT_PROTOTYPES   0xffff

 #include<stdio.h>
 #include<X11/Xlib.h>
 #include<GL/glx.h>

 Display                 *dpy    = XOpenDisplay(NULL);
 Window                  root    = DefaultRootWindow(dpy);
 GLint                   att[]   = {GLX_RGBA, None};
 XVisualInfo             *vi     = glXChooseVisual(dpy, 0, att);
 GLXContext              glc     = glXCreateContext(dpy, vi, NULL, False);
 Visual                  *vis    = DefaultVisual(dpy, 0);
 Colormap                cmap    = XCreateColormap(dpy, root, vis, AllocNone);
 int                     dep     = DefaultDepth(dpy, 0);
 XSetWindowAttributes    swa;
 Window                  win;
 GLuint                 vbuf[]  = {98, 99, 100};

 typedef void (APIENTRY * PFNGLGENBUFFERSARBPROC) (GLsizei , GLuint *);

 PFNGLGENBUFFERSARBPROC         glGenBuffersARB = NULL;

int main(int argc, char *argv[]){
 swa.colormap           = cmap;
 swa.border_pixel       = 0;

 win = XCreateWindow(dpy, root, 0, 0, 100, 100, 0, dep, InputOutput, vis, CWColormap | CWBorderPixel, &swa);
 XMapWindow(dpy, win);
 glXMakeCurrent(dpy, win, glc);

 glGenBuffersARB = (PFNGLGENBUFFERSARBPROC) glXGetProcAddressARB((const GLubyte*) "glGenBuffersARB");
 glGenBuffersARB(3, &(vbuf[0]));
 printf("
	BUFFER IDS = %i, %i, %i

", vbuf[0], vbuf[1], vbuf[2]); }

So everything else is working except VBO?

Try

int main(int argc, char *argv[])
{
	GLuint bufferID = 0;
	glGenBuffersARB(1, &bufferID);
}

well- seems that now i’ve woken up canada :smiley:

but if you look at my first code sample- it looks pretty much like your proposal. believe me, i’ve tried uint, GLuint and arrays of both. but it still doesn’t work. i have a current gl context. i initialized the pointer for glGenBuffersARB to NULL, so if glXGetProcAddressARB did not work, the call to glGenBuffersARB should result in a core dump, but it doesn’t…

WHAT THE HELL AM I DOING WRONG???

this is not a fake- can anybody compile this stuff on their linux machine and tell me that it works- then i’ll gladly throw my hardware out the window and buy something new <sob>

I think what V-man is getting at is initialising the value of your variable to 0 before you call glGenBuffers(). It might be that glGenBuffers() is using the value you pass in as the ID. Have you actually called glGetError() after calling glGenBuffers() to see if there is actually an error?
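
Something like this, right after the gen call (a minimal sketch, using the same glGenBuffersARB pointer as in the posted code):

 GLuint id = 0;
 glGenBuffersARB(1, &id);

 /* GL_NO_ERROR == 0; anything else is a real error code */
 GLenum err = glGetError();
 if(err != GL_NO_ERROR)
	printf("glGenBuffersARB set error 0x%x\n", err);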

Each time i’ve seen this kind of error it was due to the context. Double, triple check that your context is correct.

Y.

i create a window and a context, make it current, draw a quad, no problem. so the context is good for me.

As a side note, on Linux it is illegal to name your function pointer the same as the function. You wouldn’t put ‘void * (*malloc)(size_t);’ in a program and expect things to be happy, would you? glGenBuffersARB is exactly the same.

? i don’t get your point- there’s no glGenBuffersARB defined in any of the header files.

anyway, there’s no difference if i call my function glGenBuffersNarf…

I’ll ask it again:

Have you actually called glGetError() after calling glGenBuffers() to see if there is actually an error?

…GL_NO_ERROR…

And if you try to use the buffers (or rather the values in the buffers), what happens?

It doesn’t matter whether glGenBuffersARB or glFooBar appears in the header files. Symbols that are part of the public OpenGL API may be exported by libGL, and if a symbol is publicly exported by a library, replacing it with a symbol of the same name but a different type (e.g., a function vs. a function pointer) can cause problems.

The problem is if libGL or the loaded driver tries to call the function you’re overloading. If the driver calls glGenBuffersARB (unlikely!), it will jump to your function pointer and your program will crash. It’s just like overloading malloc with ‘void * (*malloc)(size_t)’: something in another library will try to call malloc, jump to the pointer, and crash.

However, other libraries that you load might do exactly that. For example, a future version of GLUT (or FreeGLUT) could legally call glMultiTexCoord2d directly if GL version 1.3 was detected. If you have a function pointer called glMultiTexCoord2d in your app, BOOM.

Note: this has nothing to do with the problem you’re seeing here. That’s why I prefaced my original comment with “As a side note…”
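
For what it’s worth, the usual way around the name clash is to give the pointer a non-colliding name (a minimal sketch; the ‘p’ prefix and the helper name are just one convention):

 typedef void (APIENTRY *PFNGLGENBUFFERSARBPROC) (GLsizei n, GLuint *buffers);

 /* "pglGenBuffersARB" does not collide with anything libGL exports */
 static PFNGLGENBUFFERSARBPROC pglGenBuffersARB = NULL;

 void init_vbo_entry_points(void)
 {
	/* resolve once, after a context has been made current */
	pglGenBuffersARB = (PFNGLGENBUFFERSARBPROC)
		glXGetProcAddressARB((const GLubyte*) "glGenBuffersARB");
 }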

ok, here’s the complete code. what happens is a segmentation fault when glDrawArrays is called

  

 #define	GLX_GLXEXT_PROTOTYPES	0xffff

 #include<stdio.h>
 #include<X11/Xlib.h>
 #include<GL/glx.h>

 Display *dpy = XOpenDisplay(NULL);
 Window root = DefaultRootWindow(dpy);
 GLint	 att[] = {GLX_RGBA, None};
 XVisualInfo *vi = glXChooseVisual(dpy, 0, att);
 GLXContext glc = glXCreateContext(dpy, vi, NULL, False);
 Visual *vis = DefaultVisual(dpy, 0);
 Colormap cmap = XCreateColormap(dpy, root, vis, AllocNone);
 int dep = DefaultDepth(dpy, 0);
 XSetWindowAttributes swa;
 Window	win;
 GLuint	vbuf[] 	= {98, 99, 100};

 #define GL_ARRAY_BUFFER_ARB 0x8892
 #define GL_STATIC_DRAW_ARB 0x88E4

 typedef void (APIENTRY * PFNGLBINDBUFFERARBPROC) (GLenum target, GLuint buffer);
 typedef void (APIENTRY * PFNGLDELETEBUFFERSARBPROC) (GLsizei n, const GLuint *buffers);
 typedef void (APIENTRY * PFNGLGENBUFFERSARBPROC) (GLsizei n, GLuint *buffers);
 typedef void (APIENTRY * PFNGLBUFFERDATAARBPROC) (GLenum target, int size, const GLvoid *data, GLenum usage);
 typedef GLboolean (APIENTRY * PFNGLISBUFFERARBPROC) (GLuint);

 PFNGLGENBUFFERSARBPROC 	glGenBuffersARB 	= NULL;
 PFNGLDELETEBUFFERSARBPROC	glDeleteBuffersARB 	= NULL;
 PFNGLBINDBUFFERARBPROC		glBindBufferARB		= NULL;
 PFNGLBUFFERDATAARBPROC		glBufferDataARB		= NULL;
 PFNGLISBUFFERARBPROC		glIsBufferARB		= NULL;

 GLfloat	vbo_data[] = {0., 0., 0., 1., 0., 0., 0., 1., 0.};

int main(int argc, char *argv[]){
 swa.colormap           = cmap;
 swa.border_pixel       = 0;

 win = XCreateWindow(dpy, root, 0, 0, 100, 100, 0, dep, InputOutput, vis, CWColormap | CWBorderPixel, &swa);
 XMapWindow(dpy, win);
 glXMakeCurrent(dpy, win, glc);

 glGenBuffersARB 	= (PFNGLGENBUFFERSARBPROC)    glXGetProcAddressARB((const GLubyte*) "glGenBuffersARB");
 glDeleteBuffersARB 	= (PFNGLDELETEBUFFERSARBPROC) glXGetProcAddressARB((const GLubyte*) "glDeleteBuffersARB");
 glBindBufferARB 	= (PFNGLBINDBUFFERARBPROC)    glXGetProcAddressARB((const GLubyte*) "glBindBufferARB");
 glBufferDataARB 	= (PFNGLBUFFERDATAARBPROC)    glXGetProcAddressARB((const GLubyte*) "glBufferDataARB");
 glIsBufferARB 		= (PFNGLISBUFFERARBPROC)      glXGetProcAddressARB((const GLubyte*) "glIsBufferARB");

 if(glGenBuffersARB && glDeleteBuffersARB && glBindBufferARB && glBufferDataARB && glIsBufferARB)
	printf("
	VBO SETUP OK
");

 glGenBuffersARB(3, vbuf);
 printf("
	BUFFER IDS = %i, %i, %i

", vbuf[0], vbuf[1], vbuf[2]); 
 printf("	GEN BUFFER     : ERROR = %i
", glGetError());

 if(glIsBufferARB(vbuf[0]) == False)
	printf("\n\tBUFFER NOT VALID\n\n");

 glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbuf[0]);
 printf("	BIND BUFFER    : ERROR = %i
", glGetError());

 glBufferDataARB(GL_ARRAY_BUFFER_ARB, 9*sizeof(float), vbo_data, GL_STATIC_DRAW_ARB);
 printf("	BUFFER DATA    : ERROR = %i
", glGetError());

 glEnableClientState(GL_VERTEX_ARRAY); 
 printf("	ENABLE ARRAY   : ERROR = %i
", glGetError());

 glVertexPointer(3, GL_FLOAT, 0, NULL);
 printf("	VERTEX POINTER : ERROR = %i
", glGetError());

 glDrawArrays(GL_TRIANGLES, 0, 3);
}
//
//	gcc -I/usr/X11R6/include -L/usr/X11R6/lib -o vbo vbo.cc -lX11 -lGL -lGLU -lm
//

and i should not forget to mention this: the glIsBufferARB call returns <false>.

GreetingsFromMunich,

did you ever sort this problem out? I’m getting exactly the same problem under linux with exactly the same card. (NV quadro 4 900XGL)

Ok. I’m trying to see why.

What does this command return?

glxinfo | grep 'GL_ARB_vertex_buffer_object'

I see one problem with that code snippet. You don’t abort the program if one of the buffer extension functions fails to resolve; it just prints whether the setup succeeded and then continues. Check to see which one fails and then debug from that point on.
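
Something along those lines, right after the glXGetProcAddressARB calls in main (a sketch, using the same pointers as the posted code):

 if(!glGenBuffersARB || !glDeleteBuffersARB || !glBindBufferARB ||
    !glBufferDataARB || !glIsBufferARB){
	printf("\tVBO entry point missing, aborting\n");
	return 1;
 }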

Aaron

There are a few things to check:
Make sure the GL context is current.
Make sure the extension is actually present.

If those check out, then this would qualify as a genuine possible bug. It would be stupid if this simple thing slipped through QA.
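
Both checks can be done in code, e.g. (a minimal sketch; assumes <string.h> and <GL/glx.h> are included, and note that strstr is only a naive substring match):

 const char *ext = (const char *) glGetString(GL_EXTENSIONS);

 if(glXGetCurrentContext() == NULL)
	printf("no current GL context\n");
 else if(ext == NULL || strstr(ext, "GL_ARB_vertex_buffer_object") == NULL)
	printf("GL_ARB_vertex_buffer_object not advertised\n");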

Have you contacted Nvidia? What did they say?