GeForce FX

>>You mean their DX wrapper? (Or does it state in the GL specification that GL uses DX error messages when it bombs?)<<

If you mean what I think you mean:
like D3D, OpenGL often uses DirectDraw under Windows for the actual rendering.

And in case your next question is
“in that case D3D must work better because D3D + DirectDraw are a team”,
the answer is
NO

Humus: now it reads:

The new unified driver provides robust OpenGL® 1.4 support, with 2.0 extensions, for many of ATI’s award-winning graphics boards including:

Originally posted by FXO:
[b]Humus, the quote was actually:

The new unified driver provides robust OpenGL® 1.4 support, with 2.0 extensions, for many of ATI’s award-winning graphics boards including:

Or they have changed it since you visited.[/b]

Ah, they have changed it now. I merely did a copy’n’paste from that page. I suppose there were more confused people asking so they clarified it.

[This message has been edited by Humus (edited 11-22-2002).]

Originally posted by Humus:
[b] Ah, they have changed it now. I merely did a copy’n’paste from that page. I suppose there were more confused people asking so they clarified it.
[/b]

Blush.

You’ll want to check the page in a few hours as it gets clarified yet again.

It will eventually just say “OpenGL 1.4 support for many of ATI’s award-winning graphics boards…”

ATI does not currently export any GL2 extensions in the Linux Driver Version 2.4.3.

ATI is currently leading the OpenGL 2.0 Working Group, aka arb-gl2.
http://www.sgi.com/newsroom/press_releases/2002/september/opengl.html

The OpenGL 2.0 Working Group is currently working on a draft shading language specification and three draft GL2 extensions. See the September 2002 ARB minutes.

Repeat, ATI does not currently export any GL2 extensions in the Linux Driver Version 2.4.3.

Sorry for the confusion.

-mr. bill

What puzzles me is that while OpenGL 2 is not complete, how can 3Dlabs ship a card with GL2 support, and how can Carmack support a GL2 rendering path in Doom 3?

Since nobody from NVIDIA/ATI has responded, do any of you know their plans for supporting GL2?
Will they ship new GL2-specific boards, or will they also make some of the “older” boards GL2 compliant through drivers?

The reason OpenGL was “looking into the future” when it was first released was that they were looking at what high-end graphics stations could do at the time, and tailoring the API to that. Once consumer hardware caught up, they started to have to add extensions, because the API didn’t go further than that.

Unfortunately, there’s not a whole lot that high-end stations do today that’s much different from the consumer hardware – in many ways, the consumer hardware is leading the way. So they don’t have an already working, optimized implementation to borrow all the good bits from anymore.

Also, I don’t think OpenGL 1.0 looked into the future as much as some would have it. Little things like TEXTURE OBJECTS were missing from that version…

Originally posted by FXO:

Since nobody from NVIDIA/ATI has responded, do any of you know their plans for supporting GL2?

I am from ATI. (And Cass is from NVIDIA btw.)

ATI is leading the OpenGL 2.0 working group.

ATI helped present the 1/2 day Siggraph 2002 OpenGL 2.0 course.

ATI also presented a technology demo of the GL2 shading language at Siggraph 2002 running on a Radeon 9700.

Hopefully the next time ATI announces OpenGL 2.0 support (more correctly, GL2 extension support) I won’t have to post a retraction. [insert silly smiley face]

-mr. bill

Thanks, I’ll check out the siggraph presentation.

Sounds like there is a slight chance that the R300 will become GL2 compliant later on; correct me if I’m wrong.

Doesn’t it need loops in the fragment programs to be OGL2 compliant? Also, for that presentation, did you just use the non-looping shaders to show it off?

Originally posted by FXO:
What puzzles me is that while OpenGL 2 is not complete, how can 3Dlabs ship a card with GL2 support, and how can Carmack support a GL2 rendering path in Doom 3?

They made a GL2 driver (beta) for working on GL2. It is distributed to all the companies that are collaborating. Not much hype happening now, but GL2 is getting ready.

I’m not sure if memory is failing me, but I think I read “the first GL2 card” somewhere. It can do loops in vertex programs. It is supposed to be fully programmable, so it should be GL2-ready.

Basically, they want to make a GPU as programmable as a CPU, but a zillion times faster at 3D graphics.

Who knows, but maybe in 10 years, all we will need is a GPU whose circuits could also double as a standard CPU. For example, the MMX unit could be used for certain operations in 2D graphics, but could also be used for sound.

The GPU might become the center of the PC.

V-man