GLEW and the core profile

So today I was finally ready to switch over my application to a core profile - and I got a crash.

Upon investigation I noticed that GLEW doesn’t seem to retrieve any functions for vertex array objects.

The crashes happened because both glGenVertexArrays and glBindVertexArray are NULL.

Unfortunately GLEW’s source code is quite messy and I couldn’t fully comprehend what it does. All I noticed is that it still uses glGetString(GL_EXTENSIONS) to retrieve the extension string - which of course has been deprecated for the core profile. So my question now is: Is GLEW even capable of running on a core profile context or not? Do I have to switch to another GL loader library to make it work?

This happened on a GeForce 550 Ti with the latest drivers, while requesting a GL 3.3 core profile context; GLEW version 1.10.0.

Nevermind, I already found something myself.

Still, are those GLEW guys complete idiots or what, still insisting on making their library depend on a deprecated feature? I don’t get it.

I’m sorry, glGetString is deprecated? Since when?

Core in version | 4.4
Core since version | 1.0

I don’t use GLEW anymore for various reasons, the main one being that I switched to glLoadGen for the lovely gl:: scoped headers it can create; I simply can’t get enough of them. However, the usage of glGetString is neither deprecated nor wrong, and the problem you are having is, as far as I can tell, a problem with the recent nVidia drivers. I have not spent much time trying to diagnose this, but I can tell you that this (or at least something similar) happens to me only when running a debug build of my current application under nVidia (when compiled with full optimizations it does not crash), and it can be “solved” by deactivating “Threaded Optimization” in the nVidia Control Panel.

I’m not claiming that it necessarily is a bug in the drivers, but it’s certainly something that has surfaced in their more recent releases. I still have to sit down and analyze my code to see if I find any strange artifacts, but I’ve isolated it enough that I can make it crash simply by creating a core profile and calling glGetString right after (that is, skipping glLoadGen’s initialization completely).

You may want to try out the workaround I mentioned and see if that fixes your problem.

My money is on a driver bug, but for now I’m giving them the benefit of the doubt till I have time to address this.

EDIT: Disregard what I said about glGetString crashing. It’s wglGetProcAddress that causes the crash, as I explain in the follow-up.

So what did you find? It would be nice to share your findings, should someone down the line search for information on this and find your post.

It’s not just deprecated, glGetString(GL_EXTENSIONS) is not available in core profiles, you have to use glGetStringi().

[QUOTE=Ed Daenar;1260519]I’m sorry, glGetString is deprecated? Since when?[/QUOTE]

Use of GL_EXTENSIONS as a parameter to glGetString is deprecated. In other words:

[ul][li]glGetString is not deprecated…[/li][li]…but GL_EXTENSIONS is no longer a valid parameter.[/li][/ul]

Please see the documentation at -

Specifies a symbolic constant, one of GL_VENDOR, GL_RENDERER, GL_VERSION, or GL_SHADING_LANGUAGE_VERSION. Additionally, glGetStringi accepts the GL_EXTENSIONS token.

GLEW is designed with some very wrong assumptions and it doesn’t seem that it will be fixed anytime soon.

The problem you encountered is that GLEW only looks at the extension string. But in core profiles some features are always present, and therefore OpenGL does not report an extension string entry for them. GLEW then won’t even try to get the function pointers…

Also, GLEW checks for the existence of all function pointers of an extension and reports the extension as missing if even a single pointer is NULL.
This, for example, causes it to always tell you there is no GL_EXT_direct_state_access.

For my own use I just changed the script that creates the c/h files and removed some of these nonsense “checks” that break stuff.
The relevant file is: glew-1.10.0/auto/bin/
Here is my dirty fix (v 1.10.0):

To build GLEW on Linux, go into the glew-1.10.0 directory and:

cd auto
make clean; make
cd ..
make clean; make

[QUOTE=mhagain;1260521]Use of GL_EXTENSIONS as a parameter to glGetString is deprecated. In other words:

[ul][li]glGetString is not deprecated…[/li][li]…but GL_EXTENSIONS is no longer a valid parameter.[/li][/ul]

Please see the documentation at -[/QUOTE]

Ah, indeed, even the wiki page says this about the token:

For glGetStringi only, returns the extension string supported by the implementation at index. The index is in the range [0, glGetIntegerv(GL_NUM_EXTENSIONS) - 1].

However, since we are talking about nVidia drivers as well here, I have to point out something that I had wrong in my initial post: under the circumstances I outlined, the nVidia drivers crash when calling wglGetProcAddress() after a core profile is made, not glGetString(). So the problem the OP is having may not be the same as what I’m encountering.

[QUOTE=Ed Daenar;1260519]
So what did you find? It would be nice to share your findings, should someone down the line search for information on this and find your post.[/QUOTE]



Setting glewExperimental = GL_TRUE; before the call to glewInit will make it ignore the extension string and try to initialize everything it has. The entire library is a cruel joke that operates under completely obsolete assumptions, which JUST happen to be right for compatibility profiles, but only if all extensions get reported. GLEW doesn’t classify functions by version, just by extension, it seems.

I would have dumped it for another loader library, too, if the project wasn’t so utterly dependent on it and changing this would amount to a multi-day undertaking.