What is the error with my glGetError?

I have problems with texturing some polygons, so I checked for an OpenGL error using glGetError.

The return value is always (right from the start of the program) GL_INVALID_OPERATION. When I try to clean up the error queue with

while (GL_NO_ERROR != glGetError()) {}

I run into a deadlock. Is it not possible to clear the error messages?

If I perform a valid operation like

glGetString(GL_VERSION);
int aha = glGetError();

should aha then return GL_NO_ERROR, or not?

Thanks! ULLE

Make sure you have an active rendering context when calling glGetError (i.e. call wglMakeCurrent if necessary). If you don’t, you’ll always get GL_INVALID_OPERATION.
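Something along these lines (just a sketch; SafeGetError and the two handle parameters are made up here and stand for whatever your own setup code created with GetDC / wglCreateContext):

#include <windows.h>
#include <GL/gl.h>

// hDC / hGLRC: the device and rendering context handles your
// own initialization code obtained (GetDC / wglCreateContext).
GLenum SafeGetError(HDC hDC, HGLRC hGLRC)
{
    // glGetError is only meaningful when a rendering context is current
    if (wglGetCurrentContext() == NULL)
        wglMakeCurrent(hDC, hGLRC);

    return glGetError();
}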

Regards.

Eric

P.S.: you're welcome!

Also, glGetError does not queue error codes. Once an error flag is set, no new errors are recorded until you have cleared it (by calling glGetError). If you call glGetError twice in a row, the second call will always return GL_NO_ERROR (assuming you have a valid rendering context).
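For example (just a sketch, assuming a current rendering context and a typical single-error-flag implementation):

#include <windows.h>
#include <GL/gl.h>

// Assumes a rendering context is already current on this thread.
void DemoErrorFlag()
{
    glEnd();                         // no matching glBegin -> sets GL_INVALID_OPERATION
    GLenum first  = glGetError();    // returns GL_INVALID_OPERATION and clears the flag
    GLenum second = glGetError();    // GL_NO_ERROR again (single error flag)
}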

Bob,

Reading the specs for this function in the MSDN, I am not sure you are correct:

"The glGetError function returns the value of the error flag. Each detectable error is assigned a numeric code and symbolic name. When an error occurs, the error flag is set to the appropriate error code value. No other errors are recorded until glGetError is called, the error code is returned, and the flag is reset to GL_NO_ERROR. If a call to glGetError returns GL_NO_ERROR, there has been no detectable error since the last call to glGetError, or since OpenGL was initialized.

To allow for distributed implementations, there may be several error flags. If any single error flag has recorded an error, the value of that flag is returned and that flag is reset to GL_NO_ERROR when glGetError is called. If more than one flag has recorded an error, glGetError returns and clears an arbitrary error flag value. If all error flags are to be reset, you should always call glGetError in a loop until it returns GL_NO_ERROR.

Initially, all error flags are set to GL_NO_ERROR."
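In its simplest form, the loop that last paragraph recommends could look like this (just a sketch, assuming a current rendering context):

#include <windows.h>
#include <GL/gl.h>

// Drain all error flags until every one of them reports GL_NO_ERROR.
void ClearGLErrors()
{
    while (glGetError() != GL_NO_ERROR)
        ;
}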

I must say I test glGetError in a loop like this:

#include <afxwin.h>   // CString / AfxMessageBox (MFC)
#include <GL/gl.h>

void TestGLError(CString ProcName)
{
    // Nothing meaningful to query without a current rendering context
    if (wglGetCurrentContext() == NULL)
        return;

    GLint iModelviewStackDepth, iMaxModelviewStackDepth;
    GLint iProjectionStackDepth, iMaxProjectionStackDepth;
    GLint iTextureStackDepth, iMaxTextureStackDepth;
    GLint iNameStackDepth, iMaxNameStackDepth;
    CString AllErrors = "", strError;
    GLenum GL_Error;

    // Drain every error flag, collecting the name of each error found
    do
    {
        GL_Error = glGetError();
        switch (GL_Error)
        {
        case GL_NO_ERROR:
            break;
        case GL_INVALID_ENUM:
            AllErrors += "GL_INVALID_ENUM\n";
            break;
        case GL_INVALID_VALUE:
            AllErrors += "GL_INVALID_VALUE\n";
            break;
        case GL_INVALID_OPERATION:
            AllErrors += "GL_INVALID_OPERATION\n";
            break;
        case GL_STACK_OVERFLOW:
        case GL_STACK_UNDERFLOW:
            AllErrors += (GL_Error == GL_STACK_OVERFLOW) ?
                         "GL_STACK_OVERFLOW\n" : "GL_STACK_UNDERFLOW\n";
            // Dump the current / maximum depth of each matrix and name stack
            glGetIntegerv(GL_MODELVIEW_STACK_DEPTH, &iModelviewStackDepth);
            glGetIntegerv(GL_MAX_MODELVIEW_STACK_DEPTH, &iMaxModelviewStackDepth);
            glGetIntegerv(GL_PROJECTION_STACK_DEPTH, &iProjectionStackDepth);
            glGetIntegerv(GL_MAX_PROJECTION_STACK_DEPTH, &iMaxProjectionStackDepth);
            glGetIntegerv(GL_TEXTURE_STACK_DEPTH, &iTextureStackDepth);
            glGetIntegerv(GL_MAX_TEXTURE_STACK_DEPTH, &iMaxTextureStackDepth);
            glGetIntegerv(GL_NAME_STACK_DEPTH, &iNameStackDepth);
            glGetIntegerv(GL_MAX_NAME_STACK_DEPTH, &iMaxNameStackDepth);
            strError.Format("Modelview Stack Depth : %i / %i", iModelviewStackDepth, iMaxModelviewStackDepth);
            AllErrors += strError + "\n";
            strError.Format("Projection Stack Depth : %i / %i", iProjectionStackDepth, iMaxProjectionStackDepth);
            AllErrors += strError + "\n";
            strError.Format("Texture Stack Depth : %i / %i", iTextureStackDepth, iMaxTextureStackDepth);
            AllErrors += strError + "\n";
            strError.Format("Name Stack Depth : %i / %i", iNameStackDepth, iMaxNameStackDepth);
            AllErrors += strError + "\n";
            break;
        case GL_OUT_OF_MEMORY:
            AllErrors += "GL_OUT_OF_MEMORY\n";
            break;
        default:
            AllErrors += "Unknown GL_ERROR\n";
            break;
        }
    }
    while (GL_Error != GL_NO_ERROR);

    if (AllErrors != "")
        AfxMessageBox("Procedure : " + ProcName + "\n" + AllErrors);
}

And I think it has happened that I got two errors in a row (i.e. more than one error reported within a single run of the loop).

I am not actually certain about that…

Regards.

Eric

Reading the 1.3.1 spec:

Seems like you are right. I read the spec about errors some time ago, but obviously I didn’t read all of it.

And of course I meant the 1.3 spec, not 1.3.1.