How do I pair an index buffer to a vertex buffer without 'glCreateBuffers' & 'glNamedBufferData'?

I had some 'GL_INVALID_OPERATION's that led me to notice that glNamedBufferData is only available in GL 4 (core in 4.5); is there a way to achieve the same thing in GL 2?

I want to learn the most portable GL code first, since that is what will be most widely available when I try my code on other systems. Although, if GL 4 can be made available even on Windows, which I've read somewhere only supports up to OpenGL 2.1, then I'll drop that requirement.

glGenBuffers, glBindBuffer and glBufferData.

The glNamedBuffer* functions are just direct-state-access (DSA) versions of the corresponding glBuffer* functions. DSA lets you specify objects (buffers, textures, renderbuffers, framebuffers) by name (handle, ID) rather than having to bind them to a target and then operate through that target.
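Here's a minimal sketch of the two styles side by side; the triangle data is hypothetical, and a current GL context plus a loader such as GLEW is assumed:

```c
#include <GL/glew.h>  /* any loader exposing the GL 1.5+/4.5 entry points */

/* Hypothetical example data. */
static const GLfloat  verts[]   = { -0.5f, -0.5f,  0.5f, -0.5f,  0.0f, 0.5f };
static const GLushort indices[] = { 0, 1, 2 };

/* Bind-to-edit style (core since GL 1.5, so fine for GL 2.x). */
void create_buffers_gl2(GLuint *vbo, GLuint *ibo)
{
    glGenBuffers(1, vbo);
    glBindBuffer(GL_ARRAY_BUFFER, *vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

    /* The index buffer isn't attached to the vertex buffer at all; it is
       simply whichever buffer is bound to GL_ELEMENT_ARRAY_BUFFER when
       glDrawElements is called. */
    glGenBuffers(1, ibo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, *ibo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);
}

/* DSA style (core since GL 4.5): same result, buffers specified by name. */
void create_buffers_dsa(GLuint *vbo, GLuint *ibo)
{
    glCreateBuffers(1, vbo);
    glNamedBufferData(*vbo, sizeof(verts), verts, GL_STATIC_DRAW);

    glCreateBuffers(1, ibo);
    glNamedBufferData(*ibo, sizeof(indices), indices, GL_STATIC_DRAW);
}
```

(With DSA you would typically attach the index buffer to a vertex array object via glVertexArrayElementBuffer rather than binding it at draw time.)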

The supported OpenGL version is determined by the hardware and driver. Windows’ opengl32.dll only exports the OpenGL 1.1 API; any functions added by later versions or extensions are accessed by using wglGetProcAddress to get a pointer to the function. Often a loader library such as GLEW is used to handle this, in which case you need a version of the loader which includes the required functions.
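As a rough sketch of what such a loader does under the hood on Windows (the function-pointer typedef normally comes from glext.h, and error handling here is minimal):

```c
#include <windows.h>
#include <GL/gl.h>
#include <stddef.h>

/* Function-pointer type for glBufferData; glext.h provides this as
   PFNGLBUFFERDATAPROC, using GLsizeiptr where ptrdiff_t appears here. */
typedef void (APIENTRY *PFNGLBUFFERDATAPROC)(GLenum target, ptrdiff_t size,
                                             const void *data, GLenum usage);
static PFNGLBUFFERDATAPROC pglBufferData;

/* Must be called with a current GL context; wglGetProcAddress only returns
   useful pointers for functions the current driver actually exposes. */
int load_gl_pointers(void)
{
    pglBufferData = (PFNGLBUFFERDATAPROC)wglGetProcAddress("glBufferData");
    return pglBufferData != NULL;
}
```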

Thanks. So I take it I just need to make a wrapper function that pairs glBindBuffer & glBufferData together, and ignore glNamedBufferData, to get closer to a 1.1-only API?

No.

It's impossible to get a 1.1-only API using anything at all to do with vertex buffers, because vertex buffers did not exist in 1.1; buffer objects only entered core in OpenGL 1.5.

If you want to use a 1.1 API then the simple solution is to just use 1.1.
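For reference, a 1.1-style indexed draw uses client-side vertex arrays, with the data read straight out of application memory (hypothetical triangle data again):

```c
#include <GL/gl.h>

static const GLfloat  verts[]   = { -0.5f, -0.5f,  0.5f, -0.5f,  0.0f, 0.5f };
static const GLushort indices[] = { 0, 1, 2 };

/* OpenGL 1.1: no buffer objects, so glVertexPointer takes a plain client
   pointer and glDrawElements reads the indices from client memory too. */
void draw_gl11(void)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, verts);
    glDrawElements(GL_TRIANGLES, 3, GL_UNSIGNED_SHORT, indices);
    glDisableClientState(GL_VERTEX_ARRAY);
}
```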

If you want to use GL 4.x on Windows, then you absolutely can. Wherever you sourced the claim that Windows only supports 2.1, it's a load of old nonsense.

The primary factor that determines which GL version is supported is hardware. Not software, not OS.

Hardware functionality is then provided by the GPU vendor’s device driver. OS is only relevant in that the GPU vendor must make a device driver for the OS you want to run on. Otherwise, stop believing that OS is relevant, and be suspicious of any online source that claims it is.
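If in doubt, you can simply ask the driver at runtime what it actually provides. A minimal sketch, assuming a current context:

```c
#include <stdio.h>
#include <GL/gl.h>

void print_gl_version(void)
{
    /* glGetString has been available since GL 1.0 and reports whatever
       version, renderer and vendor the installed driver exposes. */
    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
}
```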

Using a lower-level GL version is not a magic solution to make a program run everywhere. The number of devices that don't support GL 4.x is vanishingly small, and it gets smaller the farther back you go. Any such device you encounter in the wild is likely at least 10 years old, has never been updated, has never been patched, is riddled with security holes, and is probably held together with sticky tape and rubber bands. Do you really want to support running on such a device? It's not unreasonable to demand a somewhat modern driver on somewhat modern hardware.

Ah OK, I wasn't sure because of that tidbit I read somewhere. I have a tendency to assume worst-case scenarios when programming, which helps reduce uncaught errors and in turn makes life easier for me (and anyone else who tries my code) when tracking down the cause of the remaining errors. Given what you just posted, I'll just assume GL 4 then; it's decent enough for general coding and is likely the minimum I need to replicate the Dragon Quest Builders 2 engine on Linux (which is the eventual goal; hopefully it in turn nets me a job with, I think it was, Square Enix).
