glTextureSubImage2D vs GL_TEXTURE_CUBE_MAP

Hello!
In OpenGL 4.5 we got DSA, which provides alternative texture-upload functions that take a texture ID instead of a target as a parameter:

void glTextureSubImage2D(
    GLuint texture,
    GLint level,
    GLint xoffset, GLint yoffset,
    GLsizei width, GLsizei height,
    GLenum format, GLenum type, const void *pixels
);

If I remember correctly, before DSA we uploaded mipmaps for cubemap textures using glTexSubImage2D, specifying different targets to distinguish between the cubemap faces (GL_TEXTURE_CUBE_MAP_NEGATIVE_X and so on). How do we do the same with DSA now?
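Roughly like this, if memory serves (tex, size and facePixels are just placeholders, and the texture is assumed to already have storage allocated):

GLuint tex;                      /* assumed: existing cubemap texture */
GLint level = 0;                 /* assumed mip level */
GLsizei size = 256;              /* assumed face size at that level */
const void *facePixels[6];       /* assumed per-face pixel data */

glBindTexture(GL_TEXTURE_CUBE_MAP, tex);
for (int face = 0; face < 6; ++face)
{
    /* The face targets are consecutive enums, so all six faces can be
       addressed as GL_TEXTURE_CUBE_MAP_POSITIVE_X + face. */
    glTexSubImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X + face,
                    level,
                    0, 0,            /* xoffset, yoffset */
                    size, size,      /* width, height */
                    GL_RGBA, GL_UNSIGNED_BYTE,
                    facePixels[face]);
}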
Thanks! :)

A cubemap under DSA is treated as though it were a 2D array texture with exactly 6 layers. You could likewise consider it to be a cubemap array with only one layer (and therefore 6 layer-faces). It’s the same either way.

You use glTextureSubImage3D.
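In other words, each face is one layer, selected with zoffset and uploaded with a depth of 1. A minimal sketch of what that looks like (assuming a texture created with glCreateTextures and glTextureStorage2D; tex, size and facePixels are placeholders, not anything from the spec):

GLuint tex;
GLsizei size = 256;                   /* assumed face size */
const void *facePixels[6];            /* assumed per-face pixel data */

glCreateTextures(GL_TEXTURE_CUBE_MAP, 1, &tex);
glTextureStorage2D(tex, 1, GL_RGBA8, size, size);   /* storage for all 6 faces */

for (int face = 0; face < 6; ++face)
{
    /* zoffset picks the face (layer); depth = 1 uploads a single face.
       Layer order follows the face order: +X, -X, +Y, -Y, +Z, -Z. */
    glTextureSubImage3D(tex,
                        0,                 /* level */
                        0, 0, face,        /* xoffset, yoffset, zoffset */
                        size, size, 1,     /* width, height, depth */
                        GL_RGBA, GL_UNSIGNED_BYTE,
                        facePixels[face]);
}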

Thank you, Alfonse!

P.S.: man4 pages have more bugs than information. Shame no one maintains the documentation.

Wow… I was really disappointed to see that error there. Sure, it’s an obvious “copy-and-paste” bug, but the DSA issues section made it clear that the ARB really thought this stuff through. So I would have expected them to be a bit more careful on an issue that DSA made such a point to change.

They added a whole table and everything to address this.

sigh Good thing I have Khronos’s bugzilla on speed-dial…

Clearly, that was just one of the hundreds of typos/mistakes I've reported over the last few months. The whole documentation needs to be bulldozed, function by function.