Is it possible to generate 3D textures from 2D textures on the fly by appending the data of multiple images with the same width and height, and then passing the result to OpenGL as a 3D texture?
Yes: allocate the 3D texture with glTexImage3D(), passing NULL for the final "data" parameter so that only storage is allocated. Then, for each 2D slice, transfer its data into the texture with glTexSubImage3D(), using the slice index as the z offset.
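A minimal sketch of those two calls, assuming a current OpenGL 1.2+ context (or GL_EXT_texture3D) and that each entry of `slices` already points to decoded `width * height` RGBA8 pixels (the function name `build3DTexture` and the RGBA8 format are my assumptions, not something from the question):

```c
/* Sketch: build a 3D texture from `count` same-sized 2D slices.
   Requires a current GL context; error checking omitted. */
#include <GL/gl.h>

GLuint build3DTexture(const void **slices, int count,
                      int width, int height)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_3D, tex);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* Allocate storage only: the data pointer is NULL. */
    glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8,
                 width, height, count, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    /* Upload each 2D image into its own depth slice. */
    for (int i = 0; i < count; ++i)
        glTexSubImage3D(GL_TEXTURE_3D, 0,
                        0, 0, i,           /* x, y, z offsets */
                        width, height, 1,  /* one slice deep  */
                        GL_RGBA, GL_UNSIGNED_BYTE, slices[i]);
    return tex;
}
```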
Just a quick idea: what if I append the 2D images before runtime, one after another (all with the same width and height), for example with ImageMagick, and then load the combined image data and pass it to GL as a 3D texture or cube texture?
I.e., I do something like this:
convert i0.jpg i1.jpg i2.jpg i3.jpg -append result.jpg
The resulting image is then the four inputs stacked vertically:
Then I load result.jpg with, for example, DevIL, and pass it as a 3D texture. Would this also work with cube textures?