Why not make the pixel shading extensions standard?

A few things I don’t understand about the ‘politics’ behind extensions :slight_smile:
First of all, it seems that certain vendor-specific extensions get promoted to ARB standard extensions and are included as part of OpenGL. The question is: at that point, why are they not part of the OpenGL CORE? I mean, if the ARB decides that those extensions are now standard, why keep them as extensions?
On the other hand, why is there so much crying about including a pixel shading language (glshade?) in v2.0 of OpenGL, if it seems that all the main manufacturers already have some form of pixel and vertex shaders as extensions? Why not just take those and make them part of the core, as opposed to waiting another year or two for the ARB to come up with a totally brand new standard glshade language? By that time DirectX will have pixel shading 10.0, which might again be ahead of the pack?
:slight_smile:
Thanks,
Luke

The reason is that Microsoft does not support OpenGL anymore, so for compatibility reasons, everything above OpenGL 1.1 HAS to be an extension.

And it’s OpenGL you are “ahead of the pack” with, as you can use any great new feature from any vendor immediately (via extensions), while with D3D you cannot use it until Microsoft decides to implement it.

“The reason is that Microsoft does not support OpenGL anymore, so for compatibility reasons, everything above OpenGL 1.1 HAS to be an extension.”
Jan, I’m not sure I understand what one has to do with the other… For instance, there will be an OpenGL 2.0 with a new shading language, right? Will that pixel shading also be an extension? Then what is the point? There are already some great pixel shading extensions, I hear.
Thank you,
Luke

OpenGL requires that everything in the core exists in the implementation. That is, if the driver reports 1.4, everything that version includes must be in the driver; if the card can’t handle something, it should be software emulated.
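As a concrete illustration, here is a minimal sketch of reading the version a driver claims; the helper name is mine, and the parsing only looks at the leading numbers because the real string usually carries vendor text after them:

```c
#include <stdio.h>
#include <GL/gl.h>

/* Read the core version the driver reports, e.g. "1.4.0 <vendor info>".
   If this returns true for 1.4, everything GL 1.4 defines must work,
   even if parts of it are emulated in software. A context must be current. */
int gl_version_at_least(int want_major, int want_minor)
{
    int major = 0, minor = 0;
    const char *ver = (const char *)glGetString(GL_VERSION);

    if (!ver || sscanf(ver, "%d.%d", &major, &minor) != 2)
        return 0;
    return major > want_major ||
           (major == want_major && minor >= want_minor);
}
```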

Direct3D has CAPS, which means that the specification (for example DX9) just defines which features you can expose, but you don’t have to support them all. If you write a driver that reports all caps as false or at their minimum values, you end up with a driver that can handle little more than colored triangles, yet it is still a valid DX implementation.
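By contrast, a DX9-style caps check looks roughly like this; a minimal sketch using the C COM macros from d3d9.h, with pixel shader 2.0 picked arbitrarily as the cap being probed:

```c
#define COBJMACROS        /* enable C-style COM macros in d3d9.h */
#include <d3d9.h>

/* Ask the HAL device what it can actually do. Unlike a GL version number,
   none of these caps are guaranteed: a driver reporting minimal values for
   everything is still a valid DX implementation. */
int d3d_supports_ps20(void)
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DCAPS9 caps;
    int ok = 0;

    if (d3d &&
        SUCCEEDED(IDirect3D9_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                                           D3DDEVTYPE_HAL, &caps)))
        ok = caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);

    if (d3d)
        IDirect3D9_Release(d3d);
    return ok;
}
```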

The ARB extensions can be thought of as D3D’s CAPS: you check them, and if the required one exists you use it; otherwise you do a workaround or just fail (see the sketch below).
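A sketch of such a check (the helper name is hypothetical; note that a bare strstr() is not safe, because one extension name can be a prefix of another):

```c
#include <string.h>
#include <GL/gl.h>

/* Return 1 if `name` appears as a complete, space-delimited token in the
   GL_EXTENSIONS string. A GL context must be current. */
int has_extension(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    const char *p = all;
    size_t len = strlen(name);

    if (!all)
        return 0;
    while ((p = strstr(p, name)) != NULL) {
        /* Skip partial matches, e.g. GL_ARB_texture inside
           GL_ARB_texture_env_combine. */
        if ((p == all || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return 1;
        p += len;
    }
    return 0;
}
```

Typical use is then `if (has_extension("GL_ARB_fragment_program")) { /* fast path */ } else { /* workaround or fail */ }`.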

So features that are still pretty unusual at the moment don’t need to be put into the core; that way more drivers can advance to the new version, and the cards that can handle the extensions can still implement them.

For example, if ARB_fragment_program went into the 1.5 core, you would end up with only a handful of cards that can implement it in HW at the moment (GeForce FX and Radeon 9500+). Everyone else would have to stay on 1.4, even if they can handle point sprites and VBO (which are new 1.5 core features), or implement fragment_program in a software fallback path, which would be pretty useless.

Extensions don’t have to make it into ARB form before going into the core; there are several EXT extensions that went straight to the core without going through ARB first. Also, some extensions change behavior when going from EXT to core. You can query the GL version and then pull features out of the driver without worrying about extensions, but I don’t recommend that. Instead, use the extension query function and the extension docs. Some extensions are enums only and can’t be queried from the driver, so if you query for GL 1.4, for example, you don’t know whether those enums are supported as well.

Basically, I use GL 1.1, and everything beyond that (1.2, etc.) I use as extensions. I don’t query the GL driver version, just the extensions. This is because ATI and other hardware might expose a lower GL version but support higher functionality through extensions. NVIDIA, I think, is one of the few companies that updates its GL core version frequently. So although ATI exposes version 1.3, it has the same feature set available through extensions. If your game relies on version 1.4, then you’re going to miss ATI even though it can do the same stuff through extensions.
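To make that concrete, here is a hedged sketch of a combined check, reusing the hypothetical gl_version_at_least() and has_extension() helpers sketched earlier in the thread; multitexturing is a real example, since it is core in GL 1.3 but also exposed as GL_ARB_multitexture:

```c
/* Accept the feature if either the reported core version is high enough
   or the old extension is present. A driver that reports GL 1.2 but lists
   GL_ARB_multitexture in its extension string still qualifies. Note the
   entry points differ: core 1.3 exports glActiveTexture, the extension
   exports glActiveTextureARB, so load whichever matches the path taken. */
int have_multitexture = gl_version_at_least(1, 3) ||
                        has_extension("GL_ARB_multitexture");
```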

Those ARB extensions are twofold: part of them are in the core and part of them are still extensions. The GL core beyond 1.1 essentially doesn’t exist on Windows because MS dropped the ball, and users need to use wglGetProcAddress() to get at the entry points; they’re not automatically exported from the DLL. I use the NVIDIA extension docs and the SGI docs for a GL conceptual overview. I’ve created my own gl/glx headers and extension-loading lib, and everything is peachy here. This way I have only those extensions in my headers which I use in my apps, and I slowly add more to the headers as I find use for them.
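For anyone rolling a similar loader, here is a minimal sketch of the mechanism on Windows, assuming the standard glext.h typedefs (the _ptr suffix is my own convention, to avoid clashing with any prototype declarations):

```c
#include <windows.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* typedefs such as PFNGLACTIVETEXTUREARBPROC */

/* opengl32.dll only exports the GL 1.1 entry points, so anything newer
   has to be fetched at runtime. wglGetProcAddress() requires a current
   GL context, and the returned pointers are context-specific in theory. */
static PFNGLACTIVETEXTUREARBPROC glActiveTextureARB_ptr;

int load_multitexture_entry_points(void)
{
    glActiveTextureARB_ptr = (PFNGLACTIVETEXTUREARBPROC)
        wglGetProcAddress("glActiveTextureARB");
    return glActiveTextureARB_ptr != NULL;
}
```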