Uniform definition in different GLSL versions

Hello!

I’m in the process of converting my engine to OpenGL 3.2. I’m using Java and JOGL 2.0 bindings.

In JOGL I specify a profile (GLProfile.GL2) which lets me use OpenGL 3.0 and GLSL 1.3. Once I remove all the deprecated functions from my code, I will demand a GLProfile.GL3 profile, which supposedly will let me use OpenGL 3.2 and GLSL 1.5, which is the best my graphics card can provide (NVIDIA 8600GT).

My question is related to the definition of uniforms. I’ve read that the keywords ‘uniform’ and ‘varying’ are deprecated and that you should always use ‘in’ and ‘out’ in the different shader stages.

So the question is: in which version were these keywords deprecated, and from which version can I use the in/out keywords instead?

The point is to migrate parts of my code gradually, so I can, for example, define the matrices which substitute for gl_ModelViewMatrix and such, before commencing to use array buffers with custom attributes.

The varying keyword is deprecated, but the uniform definition is unchanged.

IIRC, from version 1.30, using the core profile of GLSL (OpenGL 3.0), you have to use the in/out qualifiers in place of varyings and attributes.
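
For illustration, here is how the old qualifiers map onto the new ones in a #version 130 vertex shader (the variable names here are hypothetical, not from the original posts):

```glsl
#version 130

// 'attribute' becomes 'in', 'varying' becomes 'out' in the vertex stage.
in vec3 a_position;        // was: attribute vec3 a_position;
in vec2 a_texCoord;        // was: attribute vec2 a_texCoord;
out vec2 v_texCoord;       // was: varying vec2 v_texCoord;

uniform mat4 u_modelViewProjection; // 'uniform' is unchanged

void main()
{
    v_texCoord  = a_texCoord;
    gl_Position = u_modelViewProjection * vec4(a_position, 1.0);
}
```

On the fragment-shader side, the same former varying is declared as `in vec2 v_texCoord;`.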

Ok, thank you.

I will demand a GLProfile.GL3 profile, which supposedly will let me use OpenGL 3.2 and GLSL 1.5, which is the best my graphics card can provide (NVIDIA 8600GT).

Don’t current NVIDIA drivers support GL 3.3?

The point is to migrate parts of my code gradually, so I can, for example, define the matrices which substitute for gl_ModelViewMatrix and such, before commencing to use array buffers with custom attributes.

You can’t do that. Well, not according to the spec. It is illegal to declare any identifier with the “gl_” prefix; these are all considered reserved by the system. I certainly haven’t tested it, and chances are that NVIDIA’s rather loose GLSL compiler will allow it. But I wouldn’t get too attached to it.

An alternative is to use the preprocessor: #define each of the “gl_” items you are using. Then just swap what the definitions expand to when you make the change.
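
A minimal sketch of that #define approach (the macro and uniform names are hypothetical):

```glsl
#version 130

// Before migration: the macro simply expands to the built-in.
#define modelViewProjection gl_ModelViewProjectionMatrix

// After migration: remove the line above, declare your own uniform,
// and point the macro at it instead — the rest of the shader is untouched.
// uniform mat4 u_modelViewProjection;
// #define modelViewProjection u_modelViewProjection

void main()
{
    gl_Position = modelViewProjection * gl_Vertex;
}
```

The shader body never mentions “gl_ModelViewProjectionMatrix” directly, so the switch is a two-line change.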

Don’t current NVIDIA drivers support GL 3.3?

I suppose yes, but I’m getting GL 3.2 and GLSL 1.5 when I query with glGetString( GL_VERSION ) and glGetString( GL_SHADING_LANGUAGE_VERSION ). That is, when I use a GL3 profile in JOGL (which my code still doesn’t support, so it crashes).

When I use a GL2 profile instead (which oddly supports OpenGL 3.0), I get OpenGL version 3.0 and GLSL 1.3.

I suppose that the 3.2 limit I get is for the currently installed graphics card. When I had a lower card installed (5600GS), the OpenGL and GLSL versions were lower as well.

You can’t do that. Well, not according to the spec. It is illegal to declare any identifier with the “gl_” prefix; these are all considered reserved by the system. I certainly haven’t tested it, and chances are that NVIDIA’s rather loose GLSL compiler will allow it. But I wouldn’t get too attached to it.

Fortunately, using the GL2 profile I mentioned and placing ‘#version 130’ in the shaders, the compile log outputs warnings about ‘varying’ and ‘gl_*’ being used, but it keeps working :)
Edit: If I set the #version to 150, the warnings become errors. Well, I will be able to migrate using version 130.
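
For reference, the fragment-shader side migrates the same way under #version 130: ‘varying’ becomes ‘in’, and gl_FragColor is replaced by a user-declared output (names here are hypothetical):

```glsl
#version 130

in vec2 v_texCoord;          // was: varying vec2 v_texCoord;
uniform sampler2D u_diffuse;

out vec4 fragColor;          // replaces the deprecated gl_FragColor

void main()
{
    // texture() replaces the deprecated texture2D() in GLSL 1.30.
    fragColor = texture(u_diffuse, v_texCoord);
}
```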

An alternative is to use the preprocessor: #define each of the “gl_” items you are using. Then just swap what the definitions expand to when you make the change.

Sorry, I don’t understand you here. What should I #define the ‘gl_’ items to?

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.