texture3D crashes on ATI cards

So slowly but surely I'm running out of ideas. I developed an application to view DTI datasets using 3D textures. Since I developed on my 8800 GTX, I took certain things for granted which later turned out to be NVIDIA- or at least DX10-specific. Now I'm trying to get my application running on ATI hardware again. This time it's a laptop with a Mobility Radeon HD 3470, which I think is fairly recent.

I narrowed the problem down to this line:

col1 = texture3D(tex, gl_TexCoord[0].xyz).rgb;

The shader compiler complained here that the dot operator isn't available for array access, so I changed it to

vec4 texcoord = gl_TexCoord[0];
col1 = texture3D(tex, texcoord.xyz).rgb;

This compiles without warnings, but now it simply crashes the application hard. I'm running out of ideas. What am I doing wrong here, or is it simply that texture3D doesn't work on these cards? Even that would help me, since then I could stop trying and tell my customers to get an NVIDIA card.
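For context, the lookup sits in a fragment shader roughly like this (a minimal sketch of the relevant part; only the uniform name tex comes from my real code, the rest is filled in for illustration):

uniform sampler3D tex;

void main()
{
    vec4 texcoord = gl_TexCoord[0];                 // copy first, see above
    vec3 col1 = texture3D(tex, texcoord.xyz).rgb;   // this lookup is where it dies
    gl_FragColor = vec4(col1, 1.0);
}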

Try (texture3D(tex, texcoord.xyz)).rgb; or
wrap it as vec3(texture3D(tex, texcoord.xyz));

Thanks for the reply, but that didn't help.

Perhaps it is a precision qualifier problem. I had problems with that in the past: no compiler warnings, but hard crashes. See this topic for more details:

http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=257370#Post257370

(That was a while ago, so perhaps this is fixed/changed in newer Catalyst drivers.)

OK, now every variable has a precision qualifier. They shouldn't need one, since as brolingstanz pointed out in that other thread it's a compatibility thing for ES, but I'm at the point where I'll try everything … and … it still crashes.

Did you also use precision qualifiers in the vertex shader for the out variables that you use as in variables in the fragment shader (that was the problem in my case)?

It is just a hunch; otherwise I have no idea what could cause the problem.
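To make it concrete, I mean something like this (just a sketch; the varying name is made up, and whether desktop GLSL accepts precision qualifiers at all depends on the #version and driver):

// vertex shader: precision qualifier on the varying
varying highp vec3 vTexCoord;

void main()
{
    vTexCoord   = gl_MultiTexCoord0.xyz;
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// fragment shader: the SAME qualifier on the matching varying
varying highp vec3 vTexCoord;
uniform sampler3D tex;

void main()
{
    gl_FragColor = vec4(texture3D(tex, vTexCoord).rgb, 1.0);
}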

Is your shader currently compiling without any warnings?

Yes; if I got compiler errors, I'd know what to work on.

I did mean warnings, but only because I had similar problems when trying to get an application to work on ATI. In that case it was something as simple as using texture1D instead of tex1D. I don't even think the compiler warned about it; I was just curious.
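In GLSL the built-in is texture1D; tex1D is the Cg/HLSL spelling, which (if I remember right) NVIDIA's Cg-based compiler quietly accepted. A sketch, with lut as a placeholder name:

uniform sampler1D lut;

void main()
{
    // vec4 c = tex1D(lut, 0.5);    // Cg/HLSL name; NVIDIA reportedly let this slide
    vec4 c = texture1D(lut, 0.5);   // the actual GLSL built-in
    gl_FragColor = c;
}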

I've not had a problem with texture3D on ATI, and I use it extensively in some fairly complex shaders. Most of my lookups have a vec3() inside them, wrapped around various sets of values to generate the texcoords.

So just some other suggestions…

Try wrapping a vec3() around your texcoord.xyz, or creating the coordinate as a vec3 and passing it in without the .xyz.

Also try getting a vec4 or a vec3 result from it without the .rgb swizzle; something like the sketch below.
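A sketch of both variants (the names are just placeholders):

uniform sampler3D tex;

void main()
{
    vec3 tc  = vec3(gl_TexCoord[0]);   // build the coordinate as a vec3 up front
    vec4 col = texture3D(tex, tc);     // take the whole vec4, no .rgb swizzle
    gl_FragColor = col;
}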

If that doesn't work, then perhaps try downloading Gremedy's gDEBugger (there's a 7-day trial) and see what it has to say about your shader/textures.

Thanks for the replies. It seems ATI doesn't like these arrays at all. I replaced all the arrays in code like this

uniform sampler3D texes[10];

vec4 col = vec4(0.0);
for (int i = 9 ; i > -1 ; i--)
{
     lookupTex(col, texes[i]);
}
...

with non-array variables. It looks ugly as hell now, but at least it doesn't crash.
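For what it's worth, if I read the GLSL 1.20 spec right, sampler arrays may only be indexed with constant integral expressions anyway, so the loop-counter index was never strictly legal; NVIDIA's compiler apparently just lets it slide. The unrolled version now looks roughly like this (a sketch with only three of the ten samplers, and a placeholder body for lookupTex):

uniform sampler3D tex0;
uniform sampler3D tex1;
uniform sampler3D tex2;
// ... and so on up to tex9

void lookupTex(inout vec4 col, in sampler3D t)
{
    // placeholder; the real helper accumulates the dataset's contribution
    col += texture3D(t, gl_TexCoord[0].xyz);
}

void main()
{
    vec4 col = vec4(0.0);
    lookupTex(col, tex2);   // the old for-loop, unrolled by hand
    lookupTex(col, tex1);
    lookupTex(col, tex0);
    gl_FragColor = col;
}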
