Maybe an OpenGL Version Problem?

Hello,

I have strange problems with OpenGL on older graphics cards with an OpenGL version lower than 2.0. First I will show you the code that I use to draw images onto a 2D texture. I use Qt 4.5 with QGLWidget as the framework and Windows XP as the OS.

The following code works perfectly on newer graphics cards such as ATI or NVIDIA.

INIT:

initializeGL()
{
   // setup viewport, projection etc.:
   glViewport(0, 0, (GLint)width(), (GLint)height());
   glEnable(GL_TEXTURE_2D);
   glClearColor( 0.0f, 0.0f, 0.0f, 0.0f );
   glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);


   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
   glRotatef(180.0f, 180.0f,0.0f,0.0f); // Rotate

}

DRAW:

   glClearColor( 0.0f, 0.0f, 0.0f, 0.0f );
   glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

   glTexImage2D(
      GL_TEXTURE_2D,
      0,
      GL_LUMINANCE,
      m_image.width(),
      m_image.height(),
      0, GL_LUMINANCE,
      GL_UNSIGNED_BYTE,
      m_image.bits());

   glEnable(GL_TEXTURE_2D);
   glBegin(GL_QUADS);      // Start Drawing A Textured Quad
   glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f); // Bottom Left
   glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f); // Bottom Right
   glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f, 1.0f, 0.0f);  // Top Right
   glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f, 1.0f, 0.0f);  // Top Left
   glEnd();                // Done Drawing The Quad
   glDisable(GL_TEXTURE_2D);

The same code does not work on my laptop with an Intel graphics card and a lower OpenGL version. The only thing I see is a white texture. I have no idea why this only works on the more capable machines. The image is scaled to a power of 2.

It would be great if someone could help me.

Regards,

Treehouse

Have you tried to see if glGetError returns anything?

Perhaps use an OpenGL debugger like http://glintercept.nutty.org/ ?
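
For instance, a minimal sketch of draining the error queue after a suspect call (assuming Qt's qDebug for output; glGetError can have several errors queued, so loop until it returns GL_NO_ERROR):

   // Drain every pending GL error after the call under suspicion.
   GLenum err;
   while ((err = glGetError()) != GL_NO_ERROR)
      qDebug("GL error after glTexImage2D: 0x%x", err);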

glGetError returns this message: invalid enumerant

Treehouse

OK, I found the line that causes the problem. I removed that line and now it works without errors on my desktop PC. But on my LAPTOP I get an “invalid value” error.

Treehouse

OK, now I have found the function that fails on my laptop:

glTexImage2D(
      GL_TEXTURE_2D,
      0,
      GL_LUMINANCE,
      m_image.width(),
      m_image.height(),
      0, GL_LUMINANCE,
      GL_UNSIGNED_BYTE,
      m_image.bits());

This function fails with: INVALID VALUE

http://www.opengl.org/documentation/specs/man_pages/hardcopy/GL/html/gl/teximage2d.html

GL_INVALID_VALUE is generated if width or height is less
than 0 or greater than 2 + GL_MAX_TEXTURE_SIZE, or if either
cannot be represented as 2^k + 2(border) for some integer value
of k.

The image size is width = 656 and height = 494, and it is an 8-bit grayscale image.

Could this be wrong??

Treehouse

How is that a power of two?
Also, some Intel cards don't support anything over 512x512, while lying in the capability queries.
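
As a quick check (just a sketch, assuming qDebug for output), you can at least see what the driver claims, keeping in mind the claim may be wrong:

   // Ask the driver for its reported maximum texture size.
   GLint maxSize = 0;
   glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);
   qDebug("Driver-reported GL_MAX_TEXTURE_SIZE: %d", maxSize);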

I tried this in the init function:

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, m_image.width(), m_image.height(), 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL); 

And in the paint function I use this:

glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 128, 128, GL_LUMINANCE, GL_UNSIGNED_BYTE, m_image.bits());

But now the glTexImage2D function returns the error:
“INVALID OPERATION”

The documentation says that this error has the following reason:

GL_INVALID_OPERATION is generated if glTexImage2D is
executed between the execution of glBegin and the
corresponding execution of glEnd.

But I don't call those functions before I call glTexImage2D. :stuck_out_tongue:

If you get GL_INVALID_OPERATION, then it would be because of crappy drivers.
But first, are your image dimensions a power of 2?

Maybe try a debugger such as GLIntercept http://www.opengl.org/wiki/Debugging_Tools

Try ARB_texture_rectangle.
I use this extension for my OpenGL video renderer, since until recently ATI had no proper support for non-power-of-two textures.
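
Roughly like this, as a sketch only (check the extension string for GL_ARB_texture_rectangle first, and note that rectangle textures use pixel coordinates instead of 0..1):

   // Assumes GL_TEXTURE_RECTANGLE_ARB is available (from GL/glext.h).
   glEnable(GL_TEXTURE_RECTANGLE_ARB);
   glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_LUMINANCE,
                m_image.width(), m_image.height(),
                0, GL_LUMINANCE, GL_UNSIGNED_BYTE, m_image.bits());

   // Texture coordinates are in pixels for rectangle textures:
   glBegin(GL_QUADS);
   glTexCoord2f(0.0f, 0.0f);                        glVertex3f(-1.0f, -1.0f, 0.0f);
   glTexCoord2f(m_image.width(), 0.0f);             glVertex3f( 1.0f, -1.0f, 0.0f);
   glTexCoord2f(m_image.width(), m_image.height()); glVertex3f( 1.0f,  1.0f, 0.0f);
   glTexCoord2f(0.0f, m_image.height());            glVertex3f(-1.0f,  1.0f, 0.0f);
   glEnd();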

My Intel mobile card does not support this feature.

:slight_smile:
656 x 494
Manually fix the image to be 512x512. gluBuild2DMipmaps won't work, I think (it would have resized the data in software for you), because it queries the driver for the max texture size and the driver lies: 512x512 is the real maximum, instead of the reported 2048x2048 (IIRC).
Image quality is the last of your concerns with that IGP, so this manual resizing isn't such a bother.
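
Something along these lines, as a sketch (assuming m_image is the QImage from your snippets; QImage::scaled does the software resize):

   // Software-resize to the size the hardware actually accepts,
   // then upload the scaled copy instead of the original.
   QImage small = m_image.scaled(512, 512);
   glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE,
                512, 512, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE,
                small.bits());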

Hi Ilian,

I tried to fix the image size to 512x512 with the following code:

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 512, 512, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL); 

But it doesn't work; every time I get the error message that this operation is not valid.

GL_LUMINANCE is then not supported, either.
Seriously, you're dealing with some of the worst silicon and drivers this century has seen. So, start with a "won't work" mindset and brute-force out the things that do work. That's how I found out the real limits of similar (but newer) Intel GPUs. Couple that with random crashes and on-screen garbage :slight_smile: , and I concluded not to support them (disabling HW acceleration) in professional software.
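
One way to do that probing, as a rough sketch: the GL_PROXY_TEXTURE_2D target asks the driver whether a size/format combination would be accepted without allocating anything (though with drivers this broken, even the proxy answer may lie):

   // Probe whether a 512x512 GL_LUMINANCE texture would be accepted.
   // If the combination is rejected, the proxy reports its width back as 0.
   glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_LUMINANCE,
                512, 512, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);

   GLint probedWidth = 0;
   glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &probedWidth);
   if (probedWidth == 0)
      qDebug("512x512 GL_LUMINANCE is rejected by this driver");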