GeForce FX vs Radeon 9700 or higher

Originally posted by DaveBaumann:
I thought it was 160 for R300 (?).

It maxes out at 94.

And at least at the current gfFX speed, I don’t want to be forced to use 1024 instructions for realtime… as it is even slower than the R300 right now. But I think the technical max should be 2x as fast, so there is hope… still, those 1024-instruction shaders, fullscreen, will for sure only work at very low resolutions… too bad…

We’ll see.

Originally posted by DaveBaumann:
[b]I thought it was 160 for R300 (?).[/b]

Well, you only have 64 ALU instructions. But then you also have instructions like texld, dcl, dcl_2d etc. (DirectX terms). I’m sure if you max all of these out and sum them up you’ll end up with 160 instructions. It’s like how the Radeon 8500 was supposed to have 22 instructions, I think, but you only really had 16 ALU instructions, 8 for each phase; the rest was texture sampling.

These shader numbers can get confusing really quickly when the marketing fluff gets involved. The 9700/9500 can have 32 texture instructions, 64 RGB instructions, and 64 alpha instructions. This is where the 160 comes from. ARB_fragment_program and DX9 typically both specify RGB and alpha instructions together, so this is where you will see a 64 instruction limit. It is possible for something like the following to take only one RGB and one alpha instruction, though:

ADD res0.rgb, src0, src1;
RCP res1.a, src2.a;

ARB_fragment_program on the 9500/9700 will collapse pairs like this into one RGB and one alpha instruction.
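If you want to see what a particular driver actually exposes rather than counting from the marketing slides, you can just ask it through the ARB_fragment_program limit queries. A rough sketch in C (assuming a current GL context and that glGetProgramivARB has already been resolved through your extension mechanism, e.g. GL_GLEXT_PROTOTYPES on Linux or wglGetProcAddress on Windows):

#define GL_GLEXT_PROTOTYPES 1
#include <stdio.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* ARB_fragment_program tokens and prototype */

/* Print the per-category instruction limits the driver reports for
   ARB_fragment_program. On an R300-class board this is where numbers
   like 64 ALU + 32 TEX come from. */
void print_fp_limits(void)
{
    GLint alu = 0, tex = 0, total = 0;

    glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                      GL_MAX_PROGRAM_NATIVE_ALU_INSTRUCTIONS_ARB, &alu);
    glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                      GL_MAX_PROGRAM_NATIVE_TEX_INSTRUCTIONS_ARB, &tex);
    glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                      GL_MAX_PROGRAM_NATIVE_INSTRUCTIONS_ARB, &total);

    printf("native ALU instructions: %d\n", alu);
    printf("native TEX instructions: %d\n", tex);
    printf("native instructions total: %d\n", total);
}

The NATIVE variants report what the hardware will actually run after the driver has finished with the program, which is the number that matters when you’re bumping into the limits.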

-Evan

Just thought I’d throw in my 2 cents about your original question.

I had been looking to upgrade from my GeForce3 for the past several months and was waiting for the GeForce FX to come out, since I’ve always used and been happy with NVIDIA cards in the past. This time, though, I went with the Radeon 9700 Pro. Here are the reasons for my decision:

  1. I don’t like a lot of noise in my computer, and after hearing the FX, I knew it would annoy me.
  2. There wasn’t that big of a performance difference.
  3. Both cards support ARB_fragment_program, which is why I wanted to upgrade.
  4. I was just tired of waiting. I’d been hearing from NVIDIA that the FX would be out “very soon” for at least the past 6 months, and it just keeps getting pushed back.
  5. I’d heard that ATI was getting better at driver support, and I’ve seen them increase their participation on this board (Evan especially).
  6. I wanted to make davepermen happy. He puts in so much work for ATI on all the forums I read… I hope he’s getting some royalties or something.

Now that I have the 9700 and have had a chance to experiment a bit with it, here are my thoughts:

  1. It’s fast. Really fast. The fill rate is amazing.
  2. It’s nice and quiet.
  3. No problems with any commercial apps so far.
  4. Development on it is not quite as smooth as with my past NVIDIA cards. It tends not to handle programming errors as gracefully and is more likely to crash Windows or freeze up the app if I do something wrong. However, the problems are infrequent, so it’s not a big deal.
  5. I just found out that they have more general support for floating-point texture formats: 1D, 2D, and 3D textures. I believe NVIDIA only supports floating point through texture_rectangle. This is very useful to me at work right now (a quick sketch of what this looks like follows below).
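Roughly what that last point looks like in code, as a sketch of my understanding rather than anything exhaustive (it assumes the GL_ATI_texture_float tokens from glext.h; on NVIDIA hardware you would go through NV_float_buffer and NV_texture_rectangle instead):

#include <GL/gl.h>
#include <GL/glext.h>   /* GL_RGBA_FLOAT32_ATI and friends */

/* Allocate an ordinary 2D texture with a 32-bit float per channel.
   The same internal formats also work with glTexImage1D/glTexImage3D,
   which is what makes this generally useful. Float textures on this
   generation are unfiltered, hence GL_NEAREST. */
GLuint create_float_texture(int width, int height, const float *pixels)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA_FLOAT32_ATI,
                 width, height, 0, GL_RGBA, GL_FLOAT, pixels);
    return tex;
}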

Hope this helps,
Zeno

Thanks Zeno! I’ve been leaning towards a 9700 or a 9500 too since this thread. The FX is kind of out of the question. I doubt I’d ever push it to the max, and I don’t play computer games. I may occasionally download some demos to check out, but I don’t buy them anymore. Programming-wise, I think I should be just fine.

What I might just wait for is the R350 chipset to come out. Then the 9700 might become cheaper!! Yeah, I definitely do not want to give up TWO expansion slots!!! Not unless the card is radically better than everything else on the market. Anyway, I’m not making the next Doom 3, so I should be fine with an R300 or an R350.

  • Halcyon

In my opinion, the fact that the FX takes up a PCI slot is not a big deal. I have five PCI slots and am using only one of them. Not only that, but modern graphics cards should be given a little space to breathe, whether they take that space for themselves or not. You don’t want a card in the first PCI slot blocking airflow to your graphics card’s heat sink/fan unit.

Here’s an off-topic question: Why do all graphics cards have the chip and heat sink on the bottom of the card? Wouldn’t it be better to put it on top so the heat can rise off of it? Is there some spec that states that you can’t have a card take any space above its slot?

– Zeno

Originally posted by Zeno:
[b]In my opinion, the fact that the FX takes up a PCI slot is not a big deal. I have five PCI slots and am using only one of them. Not only that, but modern graphics cards should be given a little space to breathe, whether they take that space for themselves or not. You don’t want a card in the first PCI slot blocking airflow to your graphics card’s heat sink/fan unit.

Here’s an off-topic question: Why do all graphics cards have the chip and heat sink on the bottom of the card? Wouldn’t it be better to put it on top so the heat can rise off of it? Is there some spec that states that you can’t have a card take any space above its slot?

– Zeno[/b]

To the first one: I plan on getting a Shuttle XPC soon. There I would only have one PCI slot, exactly the amount I need for video editing…
And losing one PCI slot is still losing one PCI slot (you paid for it but you cannot use it)… and I think it’s an evolution in the wrong direction. Sound systems go onboard, without cooling, with plenty of power today; networking goes onboard; RAID controllers go onboard; everything small, clean, and quiet. This is the right direction to make a PC worth buying for an average family. No huge noisy server: small and quiet, but still powerful and fast. The gfFX is the wrong direction in this mentality (even though it was designed to be cool and quiet; remember the announcement about using the .13 micron process?).

Yes, I think the space behind the card is reserved… you could put a cooler on it yourself, much like the passively cooled Radeon (quiet… ahh), which has cooling on both sides. I think this is a major issue with the gfFX… very unbalanced cooling…

I would like to get something for my “advertising for the Radeon”. It’s a hard “job”, as everyone got trained to love NVIDIA over the last few years (not without reason; I was an NVIDIA-only fan for quite a while, too). But things have changed: ATI works much better now, and they have been the technical leader at least since the Radeon 8500, and the technical and speed leader by far since the Radeon 9700… People forget that very easily. All I want is to open people’s eyes to realise that neither NVIDIA nor ATI is perfect, but neither NVIDIA nor ATI is bad.

Hm… btw, I haven’t had BSODs for quite a while on my Radeon… unlike you…

Anyway, I hope things will change for the gfFX; otherwise, I’m currently waiting for the first “real” info on the R350… I hope I can get to CeBIT, where they should all be visible… and audible (entering CeBIT… hm… I think nVIDIA is over there at the back left, don’t you think so, too?).

Anyway, glad to see that the people in here with a Radeon 9700 or similar are happy with their card. That’s what we all want to be in the end, no?

Just a remark about 16-bit being used for cinematic movies.

I think what they meant was that the final image is stored using 16-bit floats. The actual raytracing (and whatnot) calculation is done by the CPU, and AFAIK no CPU that I’ve heard of can do 16-bit float calcs.

Having them would be nice though, more SIMD-power.

Originally posted by macke:
[b]Just a remark about 16-bit being used for cinematic movies.

I think what they meant was that the final image is stored using 16-bit floats. The actual raytracing (and whatnot) calculation is done by the CPU, and AFAIK no CPU that I’ve heard of can do 16-bit float calcs.

Having them would be nice though, more SIMD-power. [/b]

They don’t raytrace, they rasterize. And as far as I know, they have dedicated hardware, and the pixel shading unit is 16-bit there, if I got that right from the NVIDIA documents…

Anyway, 24 bits is more than enough for the available instruction count; dunno how many errors you can accumulate on a gfFX if you use 16-bit floats, never had one to test yet…
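Just to put rough numbers on the precision gap (the mantissa widths here are my reading of the public specs, not something I’ve measured), the per-operation rounding error works out to roughly this:

#include <math.h>
#include <stdio.h>

/* Rough illustration of the precision gap being discussed. Assumed
   formats: fp16 = s10e5 (10 mantissa bits), ATI's fp24 = s16e7
   (16 mantissa bits), fp32 = IEEE single (23 mantissa bits). */
int main(void)
{
    printf("fp16 epsilon ~ %g\n", pow(2.0, -10.0));
    printf("fp24 epsilon ~ %g\n", pow(2.0, -16.0));
    printf("fp32 epsilon ~ %g\n", pow(2.0, -23.0));
    return 0;
}

So a 16-bit float only gives you about three decimal digits per operation, which is why people worry about error piling up over long shaders.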

Originally posted by davepermen:
They don’t raytrace, they rasterize…

When they need raytracing, some use BMRT, which is RenderMan-compliant. BMRT implements raytracing, radiosity and other GI algorithms not supported by PRMan.

Right… sorry, yes, they partially do; some extended RenderMan implementations support it…

That’ll make it rather difficult for GPU vendors to really claim they can run cinematic effects in realtime, in the sense of “we can run Shrek in realtime”…

Anyway, it first has to be proven that the 3 different precision modes on the gfFX (12-bit fixed point, 16- and 32-bit floating point) are really useful. Especially the 12-bit fixed point doesn’t sound very useful to me… we’ll see…
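For anyone curious how you would even pick between those three modes: as I understand the public NV_fragment_program docs (untested, since I still don’t have the card), the precision is chosen per instruction with an R/H/X suffix. A hypothetical snippet of that assembly, written out as a C string the way you would feed it to glLoadProgramNV:

/* Sketch of NV_fragment_program precision selection, based on the public
   NVIDIA docs (untested here, no NV30 at hand): the instruction suffix
   picks the precision, so the same MUL can run at three different costs. */
static const char *nv30_precision_demo =
    "!!FP1.0\n"
    "MULR R0, f[TEX0], f[TEX1];\n"   /* 32-bit float multiply       */
    "MULH H0, f[TEX0], f[TEX1];\n"   /* 16-bit half float multiply  */
    "MULX H1, f[TEX0], f[TEX1];\n"   /* 12-bit fixed-point multiply */
    "MOVR o[COLR], R0;\n"            /* write the fp32 result       */
    "END\n";

/* Loaded with something like:
   glGenProgramsNV(1, &id);
   glLoadProgramNV(GL_FRAGMENT_PROGRAM_NV, id,
                   (GLsizei)strlen(nv30_precision_demo),
                   (const GLubyte *)nv30_precision_demo);            */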

Just a little FYI:

NVIDIA has improved the FX card since the model that was reviewed. The cooling system does not run while the card is in 2D mode, so it’s pretty much one of the quietest cards for 2D. In 3D mode the cooling system is still active. They claim that the newer cooling system is 5 dB quieter than the older one that got reviewed a week or so ago.

  • Halcyon

Whenever you post such new facts, links would be great. That sounds very good. A short time ago they stated they weren’t working on it anymore and would ship it to the market the way it got tested… but it’s a great idea not to do so.

Originally posted by davepermen:
Whenever you post such new facts, links would be great. That sounds very good. A short time ago they stated they weren’t working on it anymore and would ship it to the market the way it got tested… but it’s a great idea not to do so.

About halfway down this page: http://www.hardocp.com/index.html#6462-1

Does this mean that every time you create a GL or D3D window, the fan comes on?

That will be annoying if you are a developer and are debugging.

Anyway, my PC already sounds like a vacuum cleaner, and so did my older one. That’s one reason I turn up the muzak.

This is where I got my info.

Sorry about not putting up the link. I was in a hurry to get to class and I just put this up at the last second.

  • Halcyon

And I think it’s an evolution in the wrong direction. Sound systems go onboard, without cooling, with plenty of power today; networking goes onboard; RAID controllers go onboard; everything small, clean, and quiet. This is the right direction to make a PC worth buying for an average family.

And therein lies the flaw in your logic.

The GeForce FX 5800 Ultra is not intended for family use; it’s far too powerful for them. They could do just fine with a GeForce4 MX.

The 5800 Ultra is for gamers and developers. Since gamers and developers already have loud computers (high-end hard drives have fans nowadays), adding one more part to make it even louder isn’t that bad.

Also, note that gamers don’t tend to have a lot of PCI slots in use. Indeed, for maximum cooling, a game-quality computer only has a network card (if networking isn’t integrated on the motherboard) and a sound card of some kind. As such, the loss of a PCI slot is nothing to be concerned about.

Lastly, I would point out that the non-Ultra 5800 doesn’t take up the extra slot (though I wouldn’t suggest putting something there either). It probably doesn’t make as much noise either. So, if you want NVIDIA’s features without the noise, get the standard 5800.

Originally posted by Korval:
[b]The 5800 Ultra is for gamers and developers. Since gamers and developers already have loud computers (high-end hard drives have fans nowadays), adding one more part to make it even louder isn’t that bad.[/b]

I don’t think developers’ and gamers’ annoyance at noise is any less than the average “family user’s”. Not every gamer is an overclocking fan, and certainly not every developer. Far from every developer values performance above all else.
I would consider myself both a gamer and a developer. I have my hard drives in special silencer enclosures. I bought 5400 rpm drives so they wouldn’t get too hot inside them. I bought a quiet CPU fan with adjustable speed to get as little noise as possible.

About the 5800 Ultra: even Carmack expressed that it annoyed him, even though he specifically stated that he’s not the kind of guy who tends to be annoyed by loud fans.

Originally posted by Korval:
The 5800 Ultra is for gamers and developers. Since gamers and developers already have loud computers (high-end hard drives have fans nowadays), adding one more part to make it even louder isn’t that bad.
This is where you’re wrong. Gamers certainly are concerned about noise. Look at how popular silent computing has become. I moderate a fairly large forum (5700 members), and there are never-ending requests for silent power supplies, CPU HSFs, case fans, hard disks, etc. I’ve seen it over and over: if all else is equal, comparative noise levels matter even more than price.

I can imagine what’s going on, though. Someone pictures a gamer: a teen sitting in a dark room in front of a 19" monitor with a 200-watt stereo+sub combo, playing Zombie Terror Chainsaw Menace at full volume with a sheepish grin, the pumping heavy metal soundtrack mixing with screams and explosions.
Not so.

Gamers occasionally do respect their spouses and neighbors. Compensating for fan noise with sheer audio volume is just stupid.

And it’s also false to assume that every game always produces enough racket to ‘mask’ the PC noise. Heck, some games are almost completely silent, and the little noise they do make is important for gameplay, let alone atmosphere (e.g. Thief, Splinter Cell, probably Doom 3 too).

Sound is an important aspect of any game, not some unrelated utility to help you ignore your various cooling systems.


That being said, the NV30 in itself looks pretty attractive. I’m just not going to tolerate anything that’s noisier than my case fan setup.