NVidia: OK, ATI X1650: OK, ATI 2600: Oh hell!

That’s just the thing, there are no errors. And everything works fine on older ATI cards that use the old drivers.

You don’t have to enable GL_TEXTURE_2D for ATI, do you?
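For context on that question: in the fixed-function pipeline the texture target must be explicitly enabled per texture unit, while a bound fragment shader samples textures regardless of that enable. A minimal sketch of the fixed-function path (the function name is hypothetical):

```c
/* Fixed-function texturing: the target must be enabled on the active
   unit. When a fragment shader is bound, glEnable(GL_TEXTURE_2D) is
   ignored and samplers work without it. */
#include <GL/gl.h>

void bind_texture_ffp(GLuint tex)
{
    glActiveTexture(GL_TEXTURE0);   /* select texture unit 0 */
    glEnable(GL_TEXTURE_2D);        /* required for fixed-function only */
    glBindTexture(GL_TEXTURE_2D, tex);
}
```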

I know all the new macs have ATIs.

ATI are apparently taking the workstation market very seriously these days. I’ve seen no evidence of this. Maybe it will change in GL3 and their programmers don’t have as much of a ‘challenge’.

Especially the BIOS on the graphic cards…

iMac = ATI.
MacBook Pro = NVidia.
MacBook, Mac mini = Intel.
Mac Pro = choice of two different NVidia cards or one ATI.

Still no response from ATI. I am moving on.

Maybe it will change in GL3 and their programmers don’t have as much of a ‘challenge’.

That’s the “funny” part of it: ATi just finished rewriting their OpenGL drivers. From scratch. None of the years of legacy garbage built up.

If they couldn’t pull off a decent, from-scratch implementation of GL 2.1, then they clearly won’t be able to make a from-scratch GL 3.0. After all, most of ATi’s problems are with the shading language, which isn’t changing for GL 3.0.

Still no response from ATI. I am moving on.

Wow, a whole 4 days. Such perseverance.

The best thing to do is simply address it later, when things are closer to finalized for everything else.

Nope.

But like I said, you are making use of built-in uniforms, so start by inspecting them. If those are OK, move on to outputting values from various stages of your shader to the screen.
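One way to do that "output values to the screen" step is to temporarily write the suspect value straight to the fragment color. A sketch in era-appropriate GLSL (which built-in you inspect depends on your shader; the light position here is just an example):

```glsl
// Debug fragment shader: visualize a built-in uniform instead of shading.
// If the value renders as solid black, the driver is not updating the
// built-in state your real shader depends on.
void main()
{
    vec3 lightPos = gl_LightSource[0].position.xyz;
    gl_FragColor = vec4(normalize(abs(lightPos)), 1.0); // map value to a color
}
```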

Also, the way you code can have an effect. For example, Ogre’s skinning shader causes a crash while other people’s skinning shaders don’t. The Ogre team decided to stop short instead of finding a solution.

There’s nowhere to start with this.

Lightmaps are completely black.

Some textures are completely black. Some aren’t.

Lights are backwards.

Even my old fixed-function program has broken OpenGL selection.

It’s completely fucked. 3DLabs’ ShaderGen doesn’t work, either.

And everything works perfectly on the old drivers.

There’s nowhere to start with this.

There’s always a place to start. It’s more of a question of whether you want to find the problem or just complain about it.

If you want to find the problem, then you will take things to the absolute first principles: a tiny application that renders a triangle with fixed-function. If that works, you then start doing more complicated things, bringing in more of your actual rendering code. When it breaks, you will know that it was that last change that “caused” the breakage.
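The "absolute first principles" test can be as small as this, sketched here with GLUT (any windowing layer works; it needs a display to run):

```c
/* First-principles sanity check: one fixed-function triangle, no
   shaders, no textures, no VBOs. If even this renders wrong on the new
   driver, the bug is not in your renderer's higher-level code. */
#include <GL/glut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);
    glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
    glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
    glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("triangle sanity check");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```

From here, reintroduce one feature at a time — textures, then shaders, then your framebuffer setup — until the output breaks.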

Whether it was your code’s fault (that somehow nVidia’s drivers ignored) or an ATi driver bug that manifests when you combine features X, Y and Z, you now know where the problem is. What code transforms the renderer from “functional” to “broken”.

Considering that it’s a general failure to render anything of value, I imagine it’s a fairly localized, yet global, thing that you’re doing. Whether it’s your code or a driver bug is undetermined at present.

I’d like to add that, if you’re a “graphics programmer” at a commercial game studio, dealing with hardware incompatibilities and bug work-arounds is at least 50% of your work. It’s the dirty little secret of PC gaming.

Compare to being a painter: perhaps you want to just put paint on a canvas, but you also have to lug your stuff to where the subject is, prepare your palette, clean your tools, … And you only have so much time before the oil dries and you can’t change your mind.

I’d like to add that, if you’re a “graphics programmer” at a commercial game studio, dealing with hardware incompatibilities and bug work-arounds is at least 50% of your work. It’s the dirty little secret of PC gaming.

This is unacceptable, and I have been saying for years that hardware incompatibility and aggressive antipiracy schemes would lead to the downfall of PC gaming. Now that is happening, just as I have said it would.

I just wanted to share this information to confirm that the problem is probably not in Leadwerks’ code.

A colleague has just tested Horde3D on a new notebook with an ATI HD2600 and he said that the samples are completely broken. The skinning is not working at all and the framerate is ridiculously low. Horde definitely runs on all newer NVidia cards and I have also tested it on an X1300 and X1600 without any problems. There aren’t any shader errors, not even warnings, and the shaders don’t use conditional preprocessor macros.

It is really tragic that the driver quality is this bad for a new card. People often ask me which graphics card to buy, and sadly I now have to strictly discourage everyone from getting an ATI card…

Have they submitted bugs to ATi?

Unfortunately I can’t track down the exact problem location since I don’t have such a card available.

What would be the best way to report such problems to ATI? I had a quick look at their site but couldn’t find any driver feedback.

I’m not sure there’s a “downfall of PC gaming” going on in the first place, but if you think the console world is any better, try writing cross-platform code for the different consoles out there and you might just find that PC is not that bad at all.

Email devrel@ati.com and provide a sample or point to an application that’s broken.

try writing cross-platform code for the different consoles out there and you might just find that PC is not that bad at all.

True.

However to be fair, you can sell quite a lot of games just by targeting only one console platform. It’d be foolish to release a game that was GeForce 8800GT-only. So cross-platform development for console games isn’t as necessary as it is for PC games.

I got a response from ATI’s engineers. Then ATI released version 8.1 of their drivers, which I heard fixes everything. I don’t know yet whether my app runs correctly on ATI; I haven’t tried.

I have had similar problems with the ATI drivers lately and talked about it in this post. In version 7.10 the built-ins did not get updated properly. After I upgraded to 7.12 and later 8.1, GL_SELECT didn’t work properly. The latter is more important, so I had to go back to 7.10.
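For anyone unfamiliar with the feature in question: GL_SELECT is the legacy fixed-function picking mechanism. A rough sketch of a pick pass (the `pick` wrapper and its parameters are hypothetical; the scene callback is expected to call glLoadName per object):

```c
/* Minimal GL_SELECT picking pass. Names pushed while in selection mode
   come back as hit records after switching back to GL_RENDER. */
#include <GL/gl.h>
#include <GL/glu.h>

GLuint pick(int x, int y, GLint viewport[4], void (*draw_scene)(void))
{
    GLuint buf[64];
    glSelectBuffer(64, buf);
    glRenderMode(GL_SELECT);

    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    gluPickMatrix((GLdouble)x, (GLdouble)(viewport[3] - y),
                  5.0, 5.0, viewport);   /* 5x5 pixel pick region */
    /* ...multiply in your normal projection matrix here... */

    glInitNames();
    glPushName(0);
    draw_scene();                        /* scene calls glLoadName() per object */

    glPopMatrix();
    glMatrixMode(GL_MODELVIEW);

    /* This is the step that misbehaves on the affected drivers. */
    GLint hits = glRenderMode(GL_RENDER);
    return hits > 0 ? buf[3] : 0;        /* name from the first hit record */
}
```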

Do you guys find that NVIDIA does a better job with their OpenGL drivers? I am thinking of getting a new card, and proper OpenGL support is more important to me than performance.

How did you contact ATI, Leadwerks? I am thinking of sending them a sample app with the GL_SELECT problem. Maybe that will get it fixed faster.

Jochen

You bet! Nvidia has great support for OpenGL. If you stay around on this forum long enough you’ll see that the vast majority of bugs come from ATI drivers.

N.

A bit better, but I’m certainly not gonna call it “great” either.

For example, under Vista my 8800 GTX’s driver dies when I run a relief-mapping shader. It compiles/links fine, and it runs great on a 7800 GTX or on ATI cards…

That’s just the most recent example off the top of my head, but believe me when I say that NVidia’s drivers aren’t “that” good either.

Y.

You can submit sample apps to them by registering for support and creating a ticket.

I’m interested to see their response and fix, as I’m having the exact same problem (GL_SELECT issues).