I am implementing a voxel-based global illumination feature. I have a VBO full of voxel positions, and I can successfully light the voxels and store the resulting colors in a texture or a set of vertex buffers. Each voxel stores its position as three unsigned bytes (a uvec3) and the colors of its six box faces as six unsigned-byte RGBA values (vec4s).
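For reference, a minimal sketch of how that per-voxel record might be packed on the CPU side (the struct name and field names here are my own assumptions, not fixed by anything above):

```cpp
#include <cstdint>

// Hypothetical tightly packed per-voxel record matching the layout described:
// three unsigned position bytes plus six unsigned-byte RGBA face colors.
#pragma pack(push, 1)
struct Voxel {
    uint8_t position[3];      // integer grid position, 0..255 per axis
    uint8_t faceColor[6][4];  // RGBA for the -X, +X, -Y, +Y, -Z, +Z faces
};
#pragma pack(pop)

// 3 position bytes + 6 * 4 color bytes = 27 bytes per voxel.
static_assert(sizeof(Voxel) == 27, "expected tightly packed 27-byte voxel");
```

With integer attributes like these, the vertex shader side would read them via `glVertexAttribIPointer` so the bytes arrive as integers rather than normalized floats.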
Now I need to render this data into a 3D texture and downsample it. There are a few ways to do this but I want to implement the most efficient method.
At the moment I am working with 170,000 voxels, each with a uvec3 giving its integer position in the grid. The largest possible voxel grid is 256x256x256 voxels, which means I would need either six 3D textures of that size (one per face) or a single 1536x256x256 texture.
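To make the single-texture option concrete, here is a sketch of one possible addressing scheme (an assumption on my part: the six 256-wide face volumes laid out side by side along X, so face f occupies x in [f*256, f*256 + 255]):

```cpp
// Maps (face index 0..5, voxel grid position 0..255 per axis) to a texel
// in a hypothetical 1536x256x256 atlas where face volumes are stacked
// along the X axis.
struct Texel { int x, y, z; };

Texel atlasTexel(int face, int vx, int vy, int vz) {
    return { face * 256 + vx, vy, vz };
}
```

For example, face 5 of the voxel at grid position (10, 20, 30) would land at texel (1290, 20, 30). Stacking along a single axis keeps the per-face offset a simple add, though it does mean mipmapping must be done per 256-wide slab so faces don't bleed into each other.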
OpenGL 4.0 only guarantees eight color attachments on an FBO, and four separate transform feedback outputs. Interleaved transform feedback, by contrast, is far less limited (128 components in my case).
What is the most efficient way to populate a 3D texture with my voxel data? Once the data is in the texture, I want to downsample it to build a mipmap chain for cone tracing.
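On the downsampling step: whether it's done by `glGenerateMipmap` or a custom compute/fragment pass, each mip texel of a 3D texture is typically the box-filtered average of a 2x2x2 block of the parent level, and a 256-deep chain has nine levels (256 down to 1). A CPU sketch of one such step, assuming a plain box filter and tightly packed x-major storage:

```cpp
#include <vector>

// One downsample step for a dim x dim x dim single-channel volume:
// each output texel averages the corresponding 2x2x2 block of the parent.
// (Assumption: simple box filter; a real cone-tracing pipeline may instead
// use a custom filter that accounts for opacity.)
std::vector<float> downsample(const std::vector<float>& src, int dim) {
    int half = dim / 2;
    std::vector<float> dst(half * half * half);
    for (int z = 0; z < half; ++z)
        for (int y = 0; y < half; ++y)
            for (int x = 0; x < half; ++x) {
                float sum = 0.0f;
                for (int dz = 0; dz < 2; ++dz)
                    for (int dy = 0; dy < 2; ++dy)
                        for (int dx = 0; dx < 2; ++dx)
                            sum += src[(2 * z + dz) * dim * dim +
                                       (2 * y + dy) * dim +
                                       (2 * x + dx)];
                dst[z * half * half + y * half + x] = sum / 8.0f;
            }
    return dst;
}
```

Note that with the six face volumes packed into one 1536x256x256 texture, a stock `glGenerateMipmap` would eventually average across face boundaries, so the custom-pass route (clamped to each 256-wide slab) may be the safer option there.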