Shader output clamping, problems with API


I have a basic problem somewhere between beginner and advanced level and need some clear answers.

I’m having problems using extensions and shaders. I am not sure whether my libs or my DLLs are outdated, but I cannot find anywhere to download new versions of them. I installed the latest drivers for my hardware, but I’m not sure that helped.

First thing: I use an ATI X1600 Mobility under Win32. GL_EXTENSIONS says it only supports shading language 1.00! That can’t be right, can it?
What do I need to get access to higher versions?

At home I am using that X1600; at work I am using an X850.
I am writing an application that uses float textures and framebuffers. I fill the textures with values outside the range [0,1] and render them to the buffer. I use a shader, but it only forwards the texture values, so there is no other computation.
On the X1600 I get the right values, but on the X850 the values are clamped to [0,1]. Maybe different APIs, but I don’t know whether this is a hardware or a software problem.
The color_buffer_float extension is not supported on either of them (according to GLEW).
If this isn’t a problem of the SDK/API, how can I check it without rendering some test data?
The clamping is bad, because I am trying to do numerical computations on the GPU.

Thanks a lot for any answers!


You can probably update the drivers from your laptop manufacturer’s website.
GL 2.0 is supported on the R9500 and above, which means GLSL 1.10.
Visit the hardware section and see which extensions are supported by your card. There are many driver versions, and unfortunately they are not listed in order.

The ARB extension specifies shading language 1.00 only. That’s why the extension string says ARB_shading_language_100. GLSL 1.10 is defined in the GL 2.0 core, so it’s not mentioned in the extension spec.

Just check whether you have GL >= 2.0 and use the core functions instead of the ARB ones; then you should have GLSL 1.10 available.

I am very grateful for your help, and I think this is not very well documented.
Did I understand you correctly?

The shading language version in GL_EXTENSIONS is always 1.00, because that is the version at which it was released as an ARB(!) extension. When the shading language was included in GL 2.0 core, it was no longer an extension, so it no longer needs to appear in the extension string.

If I have GL 2.0 and use something like glCreateProgramObjectARB, I am still using the 1.00 version, because it references the 1.00 spec. If I remove the ARB suffix, I am using the core, and thus whatever my hardware supports?

It seems a little confusing to me, but if I understand it correctly, that helps a lot.



The GLSL API changed a bit from ARB to 2.0 core, so it’s not “just” removing the ARB suffix.

Have a look at this tutorial:

It describes both the old ARB API and the new GL2 API, so you can see the differences.
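To give a rough idea of the differences, here is a partial correspondence between the two APIs (taken from the GL_ARB_shader_objects spec and the GL 2.0 core spec; not exhaustive):

```c
/* ARB extension                    GL 2.0 core
   glCreateShaderObjectARB     ->   glCreateShader
   glCreateProgramObjectARB    ->   glCreateProgram
   glShaderSourceARB           ->   glShaderSource
   glCompileShaderARB          ->   glCompileShader
   glAttachObjectARB           ->   glAttachShader
   glLinkProgramARB            ->   glLinkProgram
   glUseProgramObjectARB       ->   glUseProgram
   glDeleteObjectARB           ->   glDeleteShader / glDeleteProgram
   glGetInfoLogARB             ->   glGetShaderInfoLog / glGetProgramInfoLog
   glGetObjectParameterivARB   ->   glGetShaderiv / glGetProgramiv */
```

Note that the handle types differ too (GLhandleARB vs. GLuint), and the generic object functions like glDeleteObjectARB split into separate shader and program functions in core.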

When you use the ARB functions, you get version 1.00. With the new GL2 functions, you get at least version 1.10. When you need a higher version, you need to put a #version directive in the shader (but I think no driver supports 1.20 yet).
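The #version directive has to be the first line of the shader source. For example, a pass-through fragment shader like the one described above, explicitly requesting GLSL 1.10:

```glsl
#version 110
// Pass-through fragment shader: just forwards the texture value,
// no other computation.
uniform sampler2D tex;

void main()
{
    gl_FragColor = texture2D(tex, gl_TexCoord[0].st);
}
```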

When GLSL was originally formulated as an extension, it was called GL_ARB_shading_language_100. However, it was later decided that having a separate extension string for every version wasn’t the way to go. Instead, call glGetString with GL_SHADING_LANGUAGE_VERSION[_ARB] to get the actual version (similar to GL_VERSION).

You will get the same version whether you’re using the ARB extension API or the core API, since the version tokens have the same value.