[OT]: Upgrading video card, would like opinions...

First, I don’t want to get into any debate on driver quality, as I feel that is for the rest of you to decide/argue about.

I do want to ask people who have problems to file them with the appropriate IHV. I am sure that all IHVs participating on this board will agree with me on that. Everyone on this dev board can report issues with ATI products to devrel@ati.com.

-Evan

Originally posted by Humus:
Why are your problems more relevant?

Because I’ve tried the radeons on more than a single machine, humus - dorbie’s talking about problems getting his nvidia card to work with a single motherboard. That is why my ‘experiences’ are more relevant.

ehart - your drivers blue screen on lots of different configurations - sorry, but I don’t have the time to give you the exact conditions these blue screens happen under - unless you’ve started some kind of paid beta testing scheme for the people who buy your cards.
One thing: try changing the fill mode in the bump mapping shader of RenderMonkey to points, on a Radeon 8500 + Abit KT7A + Athlon 1.2GHz.
That’s something for you to get started on.
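
For what it’s worth, here is a rough sketch of what that repro boils down to in plain GL terms. RenderMonkey flips this through a UI option, so the snippet is only illustrative, not the exact code path it takes:

[code]
/* Rough sketch only - RenderMonkey switches this via its UI, and the exact
 * shader setup is whatever its bump mapping workspace does. In plain GL the
 * fill-mode switch maps to glPolygonMode. */
#include <GL/gl.h>

void draw_bump_pass_as_points(void)
{
    glPolygonMode(GL_FRONT_AND_BACK, GL_POINT);   /* fill mode = points */
    /* ... bind the bump mapping shader and draw the mesh here ... */
    glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);    /* back to normal fill */
}
[/code]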

Knackered,

Have you maybe considered that the replacement 8500 you got is faulty too ( my first 8500 was broken too )?

I don’t get random blue screens. In fact my system has never been as stable as it is now. Have you looked into what BIOS version is on the card? Have you used a utility to make sure the card has not been overclocked by mistake ( not sure this can happen, but overclocking can be done with a simple utility, so maybe )?

As for not reporting bugs because ATI doesn’t pay you, does your company not care about getting its software to run on as much hardware as possible?

I’ll repeat my point, the 8500 in my system is rock solid. The “correctness bugs” are minor ( and will be fixed ). If the 9700 is anything like the 8500 in terms of drivers, then I would recommend it immediately.

Ok - maybe it’s faulty. My apologies to ATI’s driver department - my eyes now fall on the hardware manufacturers.
You sure you haven’t had any stability problems, PH? What motherboard/cpu are you running with? What OS?

Originally posted by knackered:
[b] Because I’ve tried the radeons on more than a single machine, humus - dorbie’s talking about problems getting his nvidia card to work with a single motherboard. That is why my ‘experiences’ are more relevant.

ehart - your drivers blue screen on lots of different configurations - sorry, but I don’t have the time to give you the exact conditions these blue screens happen under - unless you’ve started some kind of paid beta testing scheme for the people who buy your cards.
One thing: try changing the fill mode in the bump mapping shader of RenderMonkey to points, on a Radeon 8500 + Abit KT7A + Athlon 1.2GHz.
That’s something for you to get started on.[/b]

If that means anything, I can start to pull out the story of a friend of mine whose GF would cause random spontaneous reboots and lockups every 15 minutes or so. (It was kinda amusing sometimes seeing him go up and down like a yo-yo on my ICQ contact list.) Moving it over to my roommate’s computer didn’t help; well, it would run a little longer, like 30 min, before rebooting or freezing. We did a lot of stuff to that card, but it would never run stable. It might just have been a faulty part, but at that time I heard a lot of similar reports from other people on the net.

I may also pull out the story of another friend of mine; we helped build him a system with a GF2 Pro. It sucked in every way possible. Games would either not work, crash, or have severe image quality problems. UT ran at 6 fps. Quake was fast, but showed severe banding even though we turned texture compression off and used 32-bit for everything. The screen was very blurry at certain refresh rates; it looked fine at 60 and 85 Hz but not at 75 in some resolutions … other rules applied to other resolutions … etc. Yes, we tried it on another machine, same problems. Put a Voodoo5 in that machine and it worked just fine …

I also find your attitude to reporting bugs quite irrational. How do you expect ATi to be able to fix bugs if you don’t report them? It’s impossible to fix bugs you don’t know about and can’t reproduce. Also, have you considered that your problems may be application problems? I assume the apps you’ve tested are those you’ve developed yourself. It has certainly happened a lot of times for me that my faulty application ran just fine on a Radeon, but not on other vendors’ hardware. My GameEngine demo I released early this year is a good example. It took several revisions before I got it up and running on GF3s too. The problem was my app, not anyone’s driver - well, except that once those problems were fixed there remained a driver problem causing GF3/4 cards to produce random output if anisotropic filtering was enabled, which btw took nVidia something like 4 months to fix. I might just as well, while I’m at it, complain about nVidia’s lousy developer support. They would not even reply to email when I reported bugs. ATi always does, fixes problems quickly, and keeps you updated.
Also, some vendors’ bugs may only show up on other vendors’ cards (nVidia + GL_CLAMP …). Some vendors also refuse to fix such bugs …
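
To make the GL_CLAMP point concrete (a sketch of the kind of thing I mean, not the exact case): if an app actually wants edge clamping it should ask for GL_CLAMP_TO_EDGE. Per spec GL_CLAMP samples the border color at the edges, so a driver that quietly treats the two modes the same just hides the app bug until the code runs on an implementation that follows the spec.

[code]
/* Sketch: requesting the wrap mode the app actually wants.
 * GL_CLAMP_TO_EDGE is core since GL 1.2 (older headers may need glext.h). */
#include <GL/gl.h>

void set_edge_clamping(GLuint tex)
{
    glBindTexture(GL_TEXTURE_2D, tex);

    /* App bug: GL_CLAMP pulls the border color in at the edges per spec,
     * which shows up as seams on a conformant implementation. */
    /* glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP); */

    /* What is usually intended: clamp to the outermost texel row/column. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
}
[/code]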

You sure you haven’t had any stability problems, PH? What motherboard/cpu are you running with? What OS?

Yes, everything runs great. This is my system,

  • Built-by-ATI Radeon 8500 ( 6143 drivers, BIOS 1.004 )
  • AthlonXP 1800+
  • ASUS A7V266-E motherboard ( has the improved VIA chipset )
  • 768 MB DDR ( don’t remember the name, but it’s not a ‘noname’ )
  • DVD drive is from ASUS ( E616 )
  • HD is an 80GB Maxtor ( fluid bearing )
  • Windows 2000 SP3 ( worked fine with SP2 but I usually apply these updates )
  • The sound chip is on the motherboard ( works fine for what I need ).

Actually, I really like stuff from ASUS. My GeForce3 is from them ( the main reason to continue buying their products ). I’ve had too many problems with noname products that I don’t mind spending a bit more for quality.

I really hope you solve your problems, since you are missing out on some really great hardware.

[This message has been edited by PH (edited 09-11-2002).]

Humus, are you speed reading my posts?
You seem to be missing crucial bits of information that would have saved you a lot of typing.
Do you really think I’ve got the time to build the PC’s we use for our simulators?
We use Fujitsus, SGIs, Intergraph ZX10s, HPs and Dells. These, in case you weren’t aware, are top quality, optimised builds. I’ve also tried both Radeons on my home machine, which I built myself. Same results.
I know you write ATI specific applications, humus, so I can understand your passionate defense of their cards. As I keep saying, I’d love the radeon to be stable, as it’s a well designed card, but it just does not seem to be.
It also seems to be required that I repeat what I have said in previous posts - these are not bugs in my own apps (although the lockups do happen in them too), they are most evident in little known apps such as 3dsMax4, 3dmark2001 and ATI’s very own RenderMonkey.
Now please stop flaming me!

Alright, let’s get this on the right track before it degenerates into a big poop-slinger. Both ATI and nVidia need better ways to inform developers about the state of their drivers. This is what I suggest (whether anyone listens is another matter).

  • both companies should have Bugzilla databases set up
  • make the databases readable to anyone, so that developers know which bugs exist and can build workarounds
  • give write access to certain developers out in the community to submit bugs, so that the databases don’t get out of hand (preventing the kind of problems the Mozilla project sometimes encounters)

Will this ever happen? Probably not. Both companies fight tooth and nail for market share and probably don’t want this to affect sales. But it is the Right Thing To Do™. And it would make development much smoother.

knackered, add me and some of my friends to the list of people with huge problems with geforces and drivers. currently it’s quite stable, but it took me about 3 years to get an nvidia gpu working quite stably… on this pc. if i try on winxp, for example, it would **** up with blue screens again… others are having the same problems…

about the “carmack couldn’t even get the console working properly some months ago”: the r300 is the card with which doom3 was presented at e3. i haven’t seen any blue screen in the movie that isn’t supposed to be seen at all… and carmack just loves the gpu, says it’s awesome, works like a charm, and the drivers are good. same for the 8500, more or less. but there is no statement about problems AT_ALL with the 9700. that’s different from the 8500, which had problems in the past.

no gpu works perfectly on every pc. sometimes there is just some tiny thing wrong that ****s it up. nothing is perfect. you had a bad experience with a very visibly faulty ati gpu. so what? i know dudes with the 8500. they don’t have such problems. take it as a fact: both companies have good drivers, no company has perfect drivers.

>>When multi-pass texturing how many texture layers are the maximum you would put down to get the right effect on a poly? Right now I’m stuck with a card that supports only 2 texture channels, so I end up making 2 – 3 passes depending on the effects used. The problem is that my pipeline (due to the math involved) may still need to be split in to at least 2 passes on a card that supports 8+ channels (I’ll still end up using only 4).<<

it depends. in generic opengl you get 8 textures, i think, and that’s it. or 6. when you use the new pixel shaders, you can sample up to 16 times from the textures, and calculate up to 64 instructions (full floating point). and it’s all interchangeable… so the number of textures, and that, does not really count anymore… it’s all mixing up into a generic shader. cute it is…
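
just as an illustration (my own sketch; the numbers depend entirely on the card and driver): you can query those limits at runtime instead of hard-coding a pass count…

[code]
/* sketch: query texture limits at runtime (GL 1.3 multitexture plus the
 * ARB_fragment_program enums from glext.h). the values reported are
 * whatever the card/driver pair supports. */
#include <stdio.h>
#include <GL/gl.h>
#include <GL/glext.h>

void print_texture_limits(void)
{
    GLint fixed_units = 0, coord_sets = 0, image_units = 0;

    /* fixed-function multitexture units (2-8 on this generation of cards) */
    glGetIntegerv(GL_MAX_TEXTURE_UNITS, &fixed_units);

    /* with fragment programs the limits split: interpolated coord sets
     * vs. how many texture images you can actually sample from */
    glGetIntegerv(GL_MAX_TEXTURE_COORDS_ARB, &coord_sets);
    glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS_ARB, &image_units);

    printf("units: %d, coord sets: %d, image units: %d\n",
           fixed_units, coord_sets, image_units);
}
[/code]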

Originally posted by knackered:
[b]Humus, are you speed reading my posts?
You seem to be missing crucial bits of information that would have saved you a lot of typing.
Do you really think I’ve got the time to build the PC’s we use for our simulators?
We use Fujitsus, SGIs, Intergraph ZX10s, HPs and Dells. These, in case you weren’t aware, are top quality, optimised builds. I’ve also tried both Radeons on my home machine, which I built myself. Same results.

<snip>

It also seems to be required that I repeat what I have said in previous posts - these are not bugs in my own apps (although the lockups do happen in them too), they are most evident in little known apps such as 3dsMax4, 3dmark2001 and ATI’s very own RenderMonkey.
Now please stop flaming me![/b]

You never mentioned exactly what apps you were talking about. I don’t know what kind of work you do, so “our simulators” certainly made it sound like you were developing some kind of simulators which would not run on the Radeons. Under those assumptions I think it’s fairly reasonable for me to point out that it might just as well be application errors.
Regardless, I think you’re making some seriously bad generalisations. Just because you have had bad luck with an ATi card doesn’t make them suck. The fact is that most people run their Radeons without problems. I’ve had nothing but bad experiences with nVidia cards; do I post claims about nVidia drivers’ suckiness? Nope. I assume I’ve had bad luck, as most people don’t have any problems with their GeForces. The only time I comment on nVidia driver quality is when someone raises their driver developers to the sky with words like “golden standard” etc., under which conditions I think leveling the field a little may be needed.

Originally posted by knackered:
I know you write ATI specific applications, humus, so I can understand your passionate defense of their cards. As I keep saying, I’d love the radeon to be stable, as it’s a well designed card, but it just does not seem to be.

The reason some of my applications are ATi specific is not some special passionate love of ATi; it’s purely because I currently own an ATi card. I try to develop for as wide a variety of graphics cards as possible, and prefer to use ARB extensions over anything vendor specific. After all, I want as many people as possible to be able to run my demos. I have no interest in giving a particular vendor an advantage. Sometimes I just need special features only available as GL_ATI extensions, or find them very interesting, so some demos obviously end up ATi-specific for that reason. If I had an nVidia card I’d have some nVidia-specific demos too. There are some features on the GF3/4 I’d love to make a demo of, for instance shadow mapping, but I can’t, since as a not-so-wealthy student I can’t justify buying another graphics card.

Originally posted by john_at_kbs_is:
[b]Another deciding factor:

The r300 has 8-texture channels right? How many will the nv30 have?[/b]

16, though only 8 texture coordinate sets (I find that a little screwy, but I can live with it). Same on the NV30 (last I checked anyway).

Err no, my comments do not stem from a single experience. That was just an example to try to illustrate what I am saying. Knackered, all you’ve done is illustrate that you can’t build a working PC - the kind of thing hardware reviewers do regularly, posting the results all the time. Most of the driver-related issues w.r.t. ATI have been functional issues, and ATI have made dramatic improvements (see Carmack’s comments); the driver issues are not BSODs under mundane conditions. Maybe it’s your BIOS settings, some other PCI card you use, or EMI from your monitor and bad shielding - I don’t care. Just quit pretending your anecdotes of inexperienced PC building are indications of ATI’s driver incompetence.

Just a quick comment ( slightly OT perhaps ), I ran the UT2003 benchmark on my 8500 and got some surprising results,

Flyby: 113 fps
Botmatch: 45 fps
at 1024x768

No graphical glitches, no crashes. I also tried the Battlefield 1942 demo, that works for me too ( lots of people seem to have problems with it ).

I can’t see what’s surprising there … did you expect higher/lower?

I was expecting lower for such a new game. I wasn’t expecting any problems though.

since they don’t have a lounge forum here i’ll post here
i’ve seen that the demo’s out, ‘UT2003’.
Q/ is it worth downloading from a game developer’s point of view, or is it just quake3 with better textures/physics (ie not improved in the graphics technique department, stencil shadows etc)?

i ask cause i have a very slow connection + 100mb will take at least a week to download

It looks great ( of course ) but it’s too fast for me. I’m not sure if there’s anything special from a developer’s point of view; in that respect nothing comes close to the Quake games. You’ll need the editor to open the Unreal archives but it’s not included in the demo.
Did I mention it looks great?

Anyway, no stencil shadows. There are some projected shadow textures ( the rotating fan ) and the lightmaps appear to be of very high quality. It looks good but it’s not anything special like DOOM 3… so you’re probably correct when you say “Quake3 with better graphics”.

The most important aspect of the UT2003 demo is of course that it’s fun. As a developer, the most interesting thing about it, I guess, is the editor (well, it’s not in the demo, only in the full game).

Not to start a new argument, but I prefer Unreal to ALL Quake games. When the first Unreal came out it had an impact on me that I hadn’t felt since the original Doom. I know everyone in this forum is all over Carmack and this is basically blasphemy, but Id games always go for graphics over playability. Q3 looked impressive and also sucked…

This however is just my opinion…

John.

Just checked out the demo, very nice… Nice water, just a shame that it only reflects a static image. I like…

John.