OpenGL 2.0 drivers

I didn’t know where to open this topic; maybe here is best.
So: will there be any OpenGL 2.0 drivers for the older graphics cards?
I mean, GeForce4 and ATI Radeon 8xxx cards are great, but they don’t support fragment shaders.
I don’t need that feature yet, but I would still like to use GL2 on these cards (because of the new OpenGL objects).

Originally posted by Csiki:
[b]I didn’t know where to open this topic; maybe here is best.
So: will there be any OpenGL 2.0 drivers for the older graphics cards?
I mean, GeForce4 and ATI Radeon 8xxx cards are great, but they don’t support fragment shaders.
I don’t need that feature yet, but I would still like to use GL2 on these cards (because of the new OpenGL objects).

[/b]

It looks like there will be no OpenGL 2 drivers for old video cards.

If you have something like an R9500 or later, you can download the 3.4 driver. It contains a beta OpenGL 2.0 (only glslang: GL2_fragment_shader, GL2_vertex_shader, GL2_shader_object); you can find the entry points if you view atioglxx.dll with FAR or another text editor. Unfortunately, there are no exact specifications for it yet, but it’s very interesting. It looks like there is a glslang 1.0 implementation in it (3.4) where gl_FBColor and others are supported.
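Those extension names are just tokens in the driver’s extension string, so their presence can be checked at runtime. A minimal sketch in C, assuming the string has already been fetched with glGetString(GL_EXTENSIONS); the helper name is made up for illustration:

```c
#include <string.h>

/* Returns 1 if `name` appears as a complete token in the space-separated
   extension list `extlist` (the format glGetString(GL_EXTENSIONS) returns),
   0 otherwise.  A plain strstr() is not enough: "GL2_shader" would also
   match inside "GL2_shader_object". */
int has_extension(const char *extlist, const char *name)
{
    size_t len = strlen(name);
    const char *p = extlist;
    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == extlist) || (p[-1] == ' ');
        int ends = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends)
            return 1;
        p += len;
    }
    return 0;
}
```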

[This message has been edited by IDen (edited 05-21-2003).]

Originally posted by IDen:
[b]It looks like there will be no OpenGL 2 drivers for old video cards.

If you have something like an R9500 or later, you can download the 3.4 driver. It contains a beta OpenGL 2.0 (only glslang: GL2_fragment_shader, GL2_vertex_shader, GL2_shader_object); you can find the entry points if you view atioglxx.dll with FAR or another text editor. Unfortunately, there are no exact specifications for it yet, but it’s very interesting. It looks like there is a glslang 1.0 implementation in it (3.4) where gl_FBColor and others are supported.

[This message has been edited by IDen (edited 05-21-2003).][/b]

I have a Geforce4 Ti4200.

Hmmm… my info is a little bit different. I heard that there will be OpenGL 2.0 for all accelerators (of course, only if the vendors want it). The thing is that OpenGL 2.0 will be separate from OpenGL 1.x. It will cover the old features and add new ones.

There is always the option of running in software mode. For older cards like the R8500, which will always run in software mode even for simple shaders because it lacks floating-point capabilities, it is questionable whether adding GL2 shader support is useful. On the other hand, you’ll need the software fallback path anyway for cases where you can’t accelerate things on high-end chips either, so it may be more or less free to support a software path for older chips.

Originally posted by Humus:
There is always the option of running in software mode. For older cards like the R8500, which will always run in software mode even for simple shaders because it lacks floating-point capabilities, it is questionable whether adding GL2 shader support is useful. On the other hand, you’ll need the software fallback path anyway for cases where you can’t accelerate things on high-end chips either, so it may be more or less free to support a software path for older chips.

Doesn’t the 3DLabs P10 processor lack floating point? They have GL2 “support”.

For older cards like the R8500, which will always run in software mode even for simple shaders because it lacks floating-point capabilities, it is questionable whether adding GL2 shader support is useful.

Actually, the thing that limits the 8500 is not its floating-point capability. The 8500 internally uses either 16-bit or 24-bit floats (I’m thinking 24-bit, but I’ve forgotten). In fact, I’m not sure the fragment capabilities of the 9500+ line have changed much compared to the 8500 line, except in the number of instructions/passes/etc. If that’s true (and there is some evidence for it), then the 8500 line might be able to run some shaders of non-trivial complexity. Obviously, something that requires more than one dependent texture address or uses too many ALU instructions can’t work.

That, and there’s the problem of the 32 varying floats. The 8500 line only supports 6 texture coordinates, which translates to only 24 varying floats. Personally, I think it was a mistake to put a fixed number into the spec; make it a queryable resource with low mandatory limits (like 16).
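The mismatch is simple arithmetic: each texture coordinate set carries a vec4, i.e. four floats. A small C sketch of the limit check, using the figures quoted above (the helper names are made up for illustration):

```c
/* Each texture coordinate set is a vec4, i.e. 4 floats.  The numbers in
   the tests (6 texcoord sets on the 8500, 32 varying floats in the draft
   spec) are the ones quoted in the post above. */
enum { FLOATS_PER_TEXCOORD_SET = 4 };

int varying_floats_available(int texcoord_sets)
{
    return texcoord_sets * FLOATS_PER_TEXCOORD_SET;
}

/* Does a shader needing `floats_needed` varying floats fit on hardware
   with the given number of texture coordinate sets? */
int shader_varyings_fit(int floats_needed, int texcoord_sets)
{
    return floats_needed <= varying_floats_available(texcoord_sets);
}
```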

Is OpenGL 2.0 going to redefine standard drawing calls and state-manipulation functions? Or is it going to be more of an upgrade, adding on to what already exists in OpenGL 1.x?

Originally posted by NitroGL:
Doesn’t the 3DLabs P10 processor lack floating point? They have GL2 “support”.

Yes, but if I’m not mistaken their GL2 support is limited to vertex shaders.

And again, I’m getting tired of waiting for anything regarding GL2…

Regarding GL2: until you can install it and use it, it doesn’t exist. Waiting for things that don’t exist is seldom useful (although I’m often guilty of doing this, too :( )

Regarding 1.x support: the recommendation in the original GL2 spec from 3dlabs was to provide 1.x compatibility as a wrapper library on top of the 2.0 driver. Yes, it would be supported. No, it wouldn’t be “core” anymore.

Originally posted by jwatte:
Regarding 1.x support: the recommendation in the original GL2 spec from 3dlabs was to provide 1.x compatibility as a wrapper library on top of the 2.0 driver. Yes, it would be supported. No, it wouldn’t be “core” anymore.

For what it’s worth, I don’t think anybody’s really advocating this position within the ARB anymore. Backward compatibility and continued strong support for the existing installed base is one of the main reasons OpenGL still exists today.

That’s good to hear. I believe that GL was done right from the get-go, and if it ain’t broke, don’t DirectX it. Pardon the expression.

From what I remember reading, GL2 was supposed to be backwards compatible. I didn’t know someone wanted to boot out the current GL.

From what I saw (the few docs I skimmed), GL2 will introduce a lot. There will be a lot of reading to do to understand and use the new features.

glVertexArrayPointer
glDrawIndexArrays

and look at these funny ones
glDrawArraysAsync
glDrawIndexedArraysAsync

There is stuff about setting up policies and accessing memory directly. All in all, it’s getting more sophisticated.

Are GL2 drivers in the works at NVidia? ATI?

The memory policies will be much more controllable, giving more control over memory resources (seemingly much more than D3D). Parallelism has also been improved a lot, like partial synchronization of command execution and issuing, and background processing. The parallel background processing sounds really good. Also, the proposed GLsync should give more information on the execution of single commands or command sequences, and the proposed improvement to flush is also welcome. There is much good in OGL2 and I have high expectations of it. (Hope I didn’t get anything wrong; I’m writing this around midnight.)

Cheers,
Pix

Are GL2 drivers in the works at NVidia? ATI?

There is no finalized, approved GL 2.0 spec yet. Maybe ATi and nVidia have some of that functionality in the works, but the actual specs aren’t final.

I don’t think Nvidia will release any GL2 drivers before the specs become official. They have enough problems with their GL1 drivers…

Originally posted by Zengar:
I don’t think Nvidia will release any GL2 drivers before the specs become official. They have enough problems with their GL1 drivers…

I don’t know. If they have time to cheat in their driver just for one program, they may have time to write a GL2 driver too.

That’s what I wanted to point out…

Hi all
I tested the GL2 shaders on a 9700 and a 9800; thanks to IronPeter for the code.
Something strange is going on with it, but it works.
The vertex shader works fine, but a simple fragment shader doesn’t work correctly with gl_FBColor.
Shader code:
// vertex shader
varying vec4 texc;
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    texc = gl_Vertex * 0.1;
}

// fragment shader
varying vec4 texc;
void main()
{
    vec4 tmp;
    tmp = texture4(0, texc);
    tmp = texture4(0, tmp);
    tmp = texture4(0, tmp);
    tmp = texture4(0, tmp);
    gl_FragColor = tmp + gl_FBColor;
}

On the ATI Radeon 9800, gl_FBColor has only the red component (it’s a bug).
On the ATI Radeon 9700, gl_FBColor isn’t added at all.
But it’s fantastic :)

You can download this sample code: http://www.gamedev.ru/download/?id=34

To my knowledge, gl_FBColor was dropped from the spec.
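Without a built-in for reading the current framebuffer color in the fragment shader, the conventional substitute for the tmp + gl_FBColor line above is fixed-function additive blending: enable GL_BLEND with glBlendFunc(GL_ONE, GL_ONE), and the shader’s output is added to what is already in the framebuffer, clamped per channel. A minimal C model of that per-channel arithmetic (the function name is made up for illustration):

```c
/* Models one channel of additive blending, i.e. what
   glEnable(GL_BLEND) + glBlendFunc(GL_ONE, GL_ONE) computes:
   result = min(src + dst, 1.0).  Inputs are assumed to be in [0, 1]. */
float blend_add(float src, float dst)
{
    float sum = src + dst;
    return sum > 1.0f ? 1.0f : sum;
}
```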