GL_EXT_framebuffer_object missing on GeForce 7050?


Any idea why one computer with a GeForce 7050, standard 163.75 driver, and XP SP2 reports that it doesn't support GL_EXT_framebuffer_object?

glewIsSupported("GL_EXT_framebuffer_object") returns 0, although many other, older GeForces pass this test.
The owner of the computer says he made no tweaks in the control panel and doesn't use any development tools.


Extension verification:
GL_ARB_draw_buffers was not found, but has the entry point glDrawBuffersARB
GL_ARB_fragment_program was not found, but has the entry point glProgramStringARB
GL_ARB_fragment_program was not found, but has the entry point glBindProgramARB

I still hope some setting in the control panel disabled all the extensions, although the computer owner says he made no changes.

I don't see how this is possible; it's a worrying sign.

(Sorry to hijack the thread, but I've got a related issue with a user on a laptop GeForce 7300.)
With GLSL, some warnings now get flagged as errors:
vec3 color = gl_SecondaryColor;
With me this generates only a pedantic warning (it should be an error),
but for them it's an error. Is there some driver switch to disable these?
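For what it's worth, gl_SecondaryColor is a vec4, and GLSL has no implicit vec4-to-vec3 conversion, so whether a given driver warns or rejects it outright is down to how strict its compiler is. An explicit swizzle sidesteps the problem on both (a sketch of the fixed line, not the user's actual shader):

```glsl
// gl_SecondaryColor is a vec4; taking .rgb makes the vec4 -> vec3
// narrowing explicit, which strict and lenient compilers both accept.
vec3 color = gl_SecondaryColor.rgb;
```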

There's this, though: they're using old drivers, 86.47.

Another identical report, this time from a GeForce 6600: the extension simply isn't supported… It's definitely not GPU-related; it must be some setting in the driver/registry/control panel…

It's possible that the user has their display set to 16-bit color, which can cause ChoosePixelFormat to select a format handled by Microsoft's software OpenGL implementation instead of the hardware-accelerated driver.

BTW, possibly related story.

5 years ago, starting with some version of the drivers, lots of extensions disappeared from my GeForce2. They reappeared only if I reinstalled the old driver. With the new driver, no tweaks helped (I tried all the tools from NVIDIA, all the settings in NVEmulate, etc.)… only a reinstall of Windows fixed it.

I set a 32-bit fullscreen mode:
Everything is OK, no errors are returned, and I check that the GL version is at least 2.0, but the extension is still missing. Extension Viewer shows the same problem… a nearly complete report is at

Check that you don't have “Extension limit” ticked in the NVIDIA display control panel. That would explain why no extensions are exported even though the entry points are still exposed.

And be sure to run your desktop in 24/32-bit color before requesting a 32-bit glutGameMode as well.

Yes, it was “Extension limit”; one user confirmed it. Thanks!

Why? Is it known to cause problems when the desktop is 16-bit and the requested game mode is 32-bit?

Why would you want to disable it? Wouldn’t it make more sense to look for a way to make it an error (as it should be) on your system as well?