I’m working with 3D textures of medical images. To transform one image into another’s frame of reference, I build a 3D texture, transform it with an appropriate matrix, extract slices of the transformed image using quads, and read the voxel data back with glReadPixels. All works well, with one weird exception that fails dramatically with what looks like a corrupted address space.
If an image has a width/height aspect ratio >= 24 (a thin image viewed edge-on) and the narrow dimension is an odd number, something, often glReadPixels, will fail, though the point of failure can vary. Lower-aspect-ratio (“squarer”) images with odd dimensions seem to work (I haven’t tested them extensively), and changing an odd image dimension to an even one by adding or removing a slice of voxels seems to fix the problem. However, I don’t want to muck with the image, even temporarily, and I don’t understand the error well enough to know this workaround will always work. The behavior is consistent on two quite different Windows boxes and on a Linux box, all with NVIDIA cards.
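One thing I haven’t ruled out is pixel pack alignment. As I understand it, glReadPixels pads each row of the returned image out to GL_PACK_ALIGNMENT (default 4), so a buffer sized for tightly packed rows would be overrun whenever width * bytesPerTexel isn’t a multiple of 4, which could look like heap corruption. It wouldn’t obviously explain why only high aspect ratios fail, though. The sketch below just shows the arithmetic; the 2-byte-texel case is an assumption for illustration, not necessarily my actual format.

```java
// Sketch of the row padding implied by GL_PACK_ALIGNMENT (default 4).
// If a readback buffer is sized as width * height * bytesPerTexel but
// the driver writes padded rows, every row past the first lands a
// little farther along than expected -- a buffer overrun.
public class PackAlignmentSketch {
    // Bytes per row that glReadPixels would actually write,
    // given the current GL_PACK_ALIGNMENT value.
    static int paddedRowBytes(int width, int bytesPerTexel, int alignment) {
        int row = width * bytesPerTexel;
        return ((row + alignment - 1) / alignment) * alignment; // round up
    }

    public static void main(String[] args) {
        // Hypothetical 2-byte texels (e.g. GL_UNSIGNED_SHORT luminance):
        System.out.println(paddedRowBytes(21, 2, 4)); // 44, not 42: 2 extra bytes per row
        System.out.println(paddedRowBytes(22, 2, 4)); // 44, exactly 22 * 2: no padding
    }
}
```

If this were the cause, calling gl.glPixelStorei(GL.GL_PACK_ALIGNMENT, 1) before the read, or sizing the buffer using the padded row length, should make odd widths behave, which would be an easy experiment.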
I tried searching for messages on this problem, but missed them if they’re out there. Is this a known situation? A game engine I used long ago had a texture aspect-ratio limit as well as the old power-of-two limit, but this seems a strange limitation to find now.
Development environment:
java: JOGL v1.1.1
OpenGL: v2.1.2
renderer: Quadro NVS 135M/PCI/SSE2
Example image sizes: 21 x 512 x 512 fails, as do widths of 17, 19, and 23; the adjacent even widths do not fail. Note that creating a 512 x 512 x 512 3D texture with a proxy call indicates that OpenGL can handle that volume, so total size should not be the problem for this much smaller image.
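For scale, here is the texture-memory arithmetic behind that claim. The 4-bytes-per-texel figure (e.g. RGBA8) is an assumption purely for illustration; the actual internal format is driver-dependent, so treat these as order-of-magnitude numbers.

```java
// Rough texture sizes: the failing volume is a small fraction of the
// 512^3 volume that the proxy-texture check accepts. Assumes 4 bytes
// per texel (e.g. RGBA8) for illustration only.
public class VolumeBytes {
    static long texBytes(long w, long h, long d, long bytesPerTexel) {
        return w * h * d * bytesPerTexel;
    }

    public static void main(String[] args) {
        System.out.println(texBytes(21, 512, 512, 4));   // 22020096 bytes, ~21 MB
        System.out.println(texBytes(512, 512, 512, 4));  // 536870912 bytes, 512 MB
    }
}
```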
Thanks for any information anyone can provide.