glDrawPixels crashes


I’m currently working with Win32/OpenGL on WinXP with an NVIDIA FX Go5700 mobile, and whenever I change the default GL_ALPHA_SCALE (no matter which, the other GL_*_SCALE and GL_*_BIAS values do the same) glDrawPixels crashes. Example:

// the following code crashes on glDrawPixels
glPixelTransferf(GL_ALPHA_SCALE, 0.2f);
glDrawPixels(10, 10, GL_RGBA, GL_UNSIGNED_BYTE, data); // crashes in nvoglnt.dll

but the exact same piece of code with the default GL_ALPHA_SCALE works!!!:

glPixelTransferf(GL_ALPHA_SCALE, 1);
glDrawPixels(10, 10, GL_RGBA, GL_UNSIGNED_BYTE, data); //works fine

What’s happening? I spent a whole week on this and didn’t get any clue how to crack this behaviour.

Any thoughts?


Apparently an OpenGL driver bug then. Try a newer driver.
If you don’t find newer ones on your laptop vendor’s website, there are sites which provide hacked install files so you can use the standard desktop drivers. Use at your own risk!

If the DrawPixels in your code is in any way performance relevant, expect the non-default PixelTransfer path to be even slower.

Thanks Relic,

Since I’m using this in a game development project, I think I will hammer the alpha bytes directly in “data”.

I guess that others will suffer from the same, but no one complains…

Do OpenGL games like DoomIII use the driver?

Thanks again for replying Relic,

Do OpenGL games like DoomIII use the driver?
Wrong question.

Doom 3 is an OpenGL program. Users install stuff on their computers from wherever they like. The graphics drivers contain the OpenGL implementation from the graphics vendor. If the app runs with that implementation, well, yes.

But you shouldn’t care about other programs or where people get their drivers from.
The important question is which OpenGL features you need to run your application, and your app needs to query for those at startup via the OpenGL version string, the extension string, the color depth, and so on.
If you don’t find the required set of functionality, your application either needs to provide a less demanding path or stop with a meaningful error message which says exactly what’s missing; the user might then be able to upgrade to a driver which supports that functionality, if it’s “only” a software feature.

If the driver crashes on your app, and you determine it’s not your fault, the OpenGL implementation needs to be fixed. => Newer driver.

Sidenote: Doom 3 probably avoids glDrawPixels, and especially non-default glPixelTransfer, altogether because it’s known to be slow.
If you program a game, you should try to replace your gl*Pixels operations with textured quads.

Yes it is. I put the same code on a Win98 computer and it worked! I guess I wasn’t prepared for an OpenGL driver bug, after all.

Thanks for the tip Relic. I’ll replace glDrawPixels with textured quads, but the latter implies more detailed work further down the road.

By the way, OpenAL has issued a new SDK and installer that corrects some bugs, including a Win98 crash!

Win98? Ouch. :wink: I never used any of that, only NT based.
For such cases there is also a possible way to get a second opinion:
If the OpenGL functionality you use doesn’t need an OpenGL version higher than 1.1, you can select a pixelformat with PFD_GENERIC_FORMAT in the PIXELFORMATDESCRIPTOR’s dwFlags field, which, if selected, will be Microsoft’s generic GDI software implementation. If that doesn’t crash as well, that’s a good indication of a driver bug. (Though there are known bugs in that implementation as well.)

The third opinion is available on NVIDIA hardware by running the SW emulation:
Get NVIDIA’s emulation tool and switch on Force Software Rasterization, which will be super slow but emulates HW features on the CPU.

I myself think that MS-DOS 3.20 was the best OS in the world. You had power over the machine. Nowadays it seems it is the machine that overpowers you…

Well, I tried the PFD_GENERIC_FORMAT flag on the laptop and got the same crash error.

But the NVIDIA SW emulation with Force Software Rasterization was a success! Though it is uber slow…

Really nice tool Relic, thanks.

I guess most common OpenGL games don’t use glDrawPixels and glPixelTransferf together, otherwise I (and a million other guys) would have noticed this sooner :slight_smile:

If you still see the crash when asking for PFD_GENERIC_FORMAT, make sure you actually got that format. ChoosePixelFormat() is rather dumb.
Either do a DescribePixelFormat() on the pixelformat index and look at the flags in the debugger, or check the glGetString() values for GL_VENDOR and GL_RENDERER to see which implementation you’re using. (That’s always good to know anyway.)

Originally posted by luiez:
I myself think that MS-DOS 3.20 was the best OS in the world. You had the power over machine. Nowdays seems that is the machine who overpowers you…
:slight_smile: I only started with MS-DOS 6.x and didn’t like it at all. But I see what you mean.

Once again, Relic, you were right. It seems that PFD_GENERIC_FORMAT is ignored, or I’m making some mess. The documentation doesn’t cover the PFD_GENERIC_FORMAT flag, although it’s declared in wingdi.h.

Basically I’m doing it like this:

and got:

Which seems to show that no Microsoft GDI generic SW implementation is present…

By the way Relic, since you are here, I have another question. When my Win32/OpenGL window’s client area is covered by another window, the frame rate gets too high, 1700 fps for instance, and I know that in this situation my graphics card starts heating like hell with a noisy fan.

Is this normal? Can be prevented?

Since most games run fullscreen, I guess this windowed behaviour is somewhat unexplored…

Thanks to all

You can enumerate the pixelformats by using DescribePixelFormat() and choosing one yourself.
The Win32 API function ChoosePixelFormat() ignores some flags, just as you experienced.

The PFD flags are documented here:

If a window is fully covered there isn’t much to render due to the clipping situation (the OpenGL pixel ownership test fails!) and the SwapBuffers() shouldn’t do anything either.
In that case the graphics engine actually has nothing more to do than to find if any pixel has to be rendered. Since there isn’t, are you sure the graphics is what’s overheating?
You’re on a laptop; the system probably becomes 100% CPU bound if you render some geometry, and I would suspect the CPU to heat up instead.
You can limit your framerate if you want to avoid excessive framerates. There are multiple ways to do that, timers in your app, waiting for vblank, etc.

I finally did it!!! Hurray!

Using DescribePixelFormat() “correctly”, I managed to get:
GL_VENDOR=Microsoft Corporation

and saw with my own eyes that glPixelTransferf(GL_ALPHA_SCALE, 0.2f) works after all with the Microsoft GDI generic SW implementation. So I guess the bug is in the hardware OpenGL driver layer.

Back to the high frame rates with the client area covered (by another window): I was able to watch the NVIDIA temperature reading rising (GPU core temperature much, much higher than ambient temperature).

The CPU usage rises too (perhaps 50%), since PeekMessage() becomes very active (with no queued events to handle).

I didn’t get satisfactory results with timers (the frame rate dropped from a steady 60 fps), so I think I’ll try working on vblank syncing later on…

Once more thanks.