High aspect ratio odd dimension 3D tex failure

I’m working with 3D textures of medical images. To transform one image into another’s frame of reference, I build a 3D texture, transform it with an appropriate matrix, and then extract slices of the transformed image using quads, reading the voxel data back with glReadPixels. Everything works well, with one weird exception that fails dramatically with what looks like a corrupted address space.

If an image has a width/height aspect ratio >= 24 (a thin image viewed edge-on) and the narrow dimension is an odd number, something, often glReadPixels, will fail, though the point of failure can vary. Lower-aspect-ratio (“squarer”) images with odd dimensions work well (I think; I haven’t tested that extensively), and changing an odd image dimension to an even number by adding or removing one slice of voxels seems to fix the problem. However, I don’t want to muck with the image, even temporarily, and I don’t really understand the error well enough to know this workaround will always work. The behavior is consistent on two quite different Windows boxes and on a Linux box, all with NVIDIA cards.

I tried searching for messages on this problem, but missed them if they’re out there. Is this a known situation? A game engine I used long ago had a texture aspect ratio limit as well as the old power-of-two limit, but this seems a strange limitation to find now.

Development Environment is:
java: using jogl v. 1.1.1,
OGL: v. 2.1.2,
renderer: Quadro NVS 135M/PCI/SSE2
Example image sizes: 21 x 512 x 512 fails, as do widths of 17, 19, and 23. The adjacent even widths do not fail. Note that creating a 512 x 512 x 512 3D texture with a proxy call indicates that OGL can handle that image, so overall size should not be the problem with this much smaller image.

Thanks for any information anyone can provide.

beamrider1.

My experience (with nVidia drivers) is that the proxy call fails only if the format is not supported. As a first step, you’d better check the value returned in maxSize by a call to:
glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, &maxSize);

then test with a proxy texture load, and then with a real texture load.

It has been much more reliable this way for me.

Ah, I have checked maximum conditions, as mentioned above, but I have not checked the specific texture sizes, data types, etc. for a texture that is failing.

Thanks for the suggestion, I’ll try that.

Beamrider1

Sounds a bit like this thread; maybe the pixel-store setting will help:
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=263103#Post263103

Aha, that was it. The aspect ratio was a red herring. The real problem was using short data (grayscale intensities rather than RGB-type values) instead of integer data, so the default row alignment of 4 bytes (GL_UNPACK_ALIGNMENT) was wrong for rows with an odd number of voxels.
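For anyone who finds this thread later, here is a quick sketch of the arithmetic in plain Java (the class and method names are just illustrative, and the image sizes are the ones from this thread): with 2-byte short texels and the default GL_UNPACK_ALIGNMENT of 4, a 21-voxel row occupies 42 bytes, but GL assumes each row in client memory starts on a 4-byte boundary, so it steps 44 bytes per row and drifts off the real data.

```java
public class UnpackAlignment {
    // Row stride (in bytes) that OpenGL assumes when reading client memory:
    // the row's byte length rounded up to the next multiple of `alignment`.
    static int glRowStride(int widthInTexels, int bytesPerTexel, int alignment) {
        int rowBytes = widthInTexels * bytesPerTexel;
        return ((rowBytes + alignment - 1) / alignment) * alignment;
    }

    public static void main(String[] args) {
        int width = 21;        // the odd narrow dimension from the failing image
        int bytesPerTexel = 2; // short grayscale intensities

        int tight = width * bytesPerTexel;                  // 42 bytes actually stored per row
        int assumed = glRowStride(width, bytesPerTexel, 4); // 44 bytes with the default alignment
        System.out.println("tight row = " + tight + " bytes, GL assumes " + assumed);

        // 2 extra bytes per row, over 512 rows x 512 slices, means GL walks
        // roughly half a megabyte past the end of the buffer -- consistent with
        // the "corrupted address space" style failures described above.
        System.out.println("overrun = " + (long) (assumed - tight) * 512 * 512 + " bytes");

        // Even widths happen to keep 2-byte rows on 4-byte boundaries, which is
        // why adding or removing one slice masked the bug. The real fix is
        // glPixelStorei(GL_UNPACK_ALIGNMENT, 1) before the upload (and the
        // matching GL_PACK_ALIGNMENT before glReadPixels), after which the
        // assumed stride matches the tight one:
        System.out.println("alignment-1 stride = " + glRowStride(width, bytesPerTexel, 1));
    }
}
```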

Thanks for your help.

Beamrider1