fract(x) not accepted in vertex shader

As in, the shader won't compile. This is happening on my NV3x (driver 81.xx); I have no problems using it in the fragment shader. Is there some restriction on using it in the VS that I don't know about?

The spec does seem to be a bit thin with respect to hardware support for specific functions. But I have no vertex shader problems with fract on a NV40 (GLSL validate says all is well). You sure it’s fract at fault? Seems like that should be fairly easy to support on NV3x hardware… it’s just x - floor(x). My guess is you’ve hit some other limit, or something. Can you post your shader and your actual driver version?

Ta, so I guess it's allowed.

Further info:

This works:
TC = vec3( gl_MultiTexCoord0.x - floor(gl_MultiTexCoord0.x), gl_MultiTexCoord0.y, time );

This doesn't (as in, doesn't compile):
TC = vec3( fract(gl_MultiTexCoord0.x), gl_MultiTexCoord0.y, time );

This also doesn't:
TC = vec3( fract(gl_Vertex.x), gl_MultiTexCoord0.y, time );

This is with driver 81.85 on a GeForce FX 5900. I'm guessing it's a driver problem.

It’s allowed and that looks like perfectly valid code to me. Compiles fine on ATI.

OK, I downloaded the latest official drivers (84.21) and the problem's still there. Can someone with a GeForce FX try out the fract command in the vertex shader to confirm this, so I can get onto NVIDIA to fix it?

Maybe it's 'frac', as in Cg?

Very, very sorry, this was my error; unbelievably stupid of me.
When I was loading in the VS I'd check for the letters 'fra' (because I separate the VS and FS by sticking the following marker in my shader source):
// fragment shader

So any line containing 'fra' — including 'fract' — was treated as the start of the fragment shader, which cut the vertex shader off early. I have no idea why I was only checking for the first three letters and not the whole word 'fragment' (hey, I'm an idiot), but even checking the whole word would still cause a problem with, e.g. (unlikely but possible):

// vs //
void main()
// fragment shader will get the current gl color
gl_Color = gl_FrontColor;

I suppose the only foolproof way is to have the VS and FS in two separate files.

You could use full tags like [Vertex shader] and [Fragment shader] like I do, rather than putting it into a comment. Almost no chance of screwing up with that.
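A minimal sketch of that tag-based approach (in Python here just for illustration; the tag strings and function name are my own choice, not from any particular loader). Matching a complete tag on its own line means a token like 'fract', or a comment mentioning 'fragment', can never be mistaken for a section marker:

```python
def split_shader_source(src):
    """Split a combined shader file into vertex and fragment sections.

    A line is a section marker only if, after stripping whitespace, it is
    exactly '[Vertex shader]' or '[Fragment shader]' -- so substrings like
    'fra' inside shader code are ignored.
    """
    sections = {}
    current = None
    for line in src.splitlines():
        tag = line.strip()
        if tag == "[Vertex shader]":
            current = "vertex"
            sections[current] = []
        elif tag == "[Fragment shader]":
            current = "fragment"
            sections[current] = []
        elif current is not None:
            sections[current].append(line)
    return {name: "\n".join(lines) for name, lines in sections.items()}


combined = """[Vertex shader]
void main() { gl_Position = ftransform(); }
[Fragment shader]
void main() { gl_FragColor = vec4(1.0); }"""

parts = split_shader_source(combined)
# parts["vertex"] and parts["fragment"] can each be fed to glShaderSource.
```

Note that a vertex shader containing fract() passes through intact, which is exactly the case the 'fra' substring check broke.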

Btw, what was the reasoning behind spec’ing it this way, that the shader entry point has to be called main? I’ve been spreading things out in separate files, but this is becoming a real nuisance. Well, perhaps it’s only a mild annoyance ;-).

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.