FX 5200

OK, this is totally OT, but anyway, I’m tired of all this NV30 emu stuff. Is it worth buying a 5200 if I have a GF2 GTS now? I guess there should be support for ARB_fragment_program, but there’s definitely no sense going for a 5800 Ultra or 9800: too expensive, and I guess they’re going to change soon.
The 5200 is crap, but is it total crap? That’s the question.

You get the full NV30 feature set, at GeForce4-like performance levels. You could certainly do a lot worse for what this card would cost you, especially if you’re coming from a GF2.

– Tom

I have a choice between a GF3 Ti200 and an FX 5200; which should I go for? The price is identical.
A GF2 GTS with software VP is a total miss; I just can’t keep going with it.

Originally posted by M/\dm/:
I have a choice between a GF3 Ti200 and an FX 5200; which should I go for? The price is identical.

If those are your options, you should OBVIOUSLY go for the 5200, which will be about twice as fast as the GF3 and will have a MUCH larger feature set (identical to that of the 5800). If you get a GF3, you’ll still be stuck with NV30 emulation, because it doesn’t support GL_ARB_fragment_program.
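
If you’d rather check this at runtime than trust a spec sheet, here’s a minimal sketch in C (untested; just the usual extension-string test, assuming a current GL context):

  #include <string.h>
  #include <GL/gl.h>

  /* Naive substring test; a strict version would tokenize the
     extension string so one name can't match as a prefix of another. */
  int has_arb_fragment_program(void)
  {
      const char *ext = (const char *)glGetString(GL_EXTENSIONS);
      return ext != NULL && strstr(ext, "GL_ARB_fragment_program") != NULL;
  }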

– Tom

No, it’s not obvious. Expecting twice the GF3 Ti200’s performance is just folly. The 5200’s performance isn’t all that hot, as you can see in the various reviews around the web. Take this one, for instance.

Note that they’ve tested the 5200 Ultra. The only benchmark where it really scores is the (Quake 3 engine based) Jedi Knight II. As I understand it, you’re looking at a 5200 non-Ultra.

If you want the features, okay. But don’t expect a good performer.

I can get the Ultra for the same price as the GF3, but what’s the difference in design: 8x1 or 4x2 pipelines, has/hasn’t FP32, etc.? Performance doesn’t matter right now, as there should be a new era after NV35, and goodbye to AGP (at least I hope).

I was unaware that the 5200 was as slow as it is (I have a tendency to ignore the benchmarks when reading reviews), but even so I would say it’s still an obvious choice when compared to a GF3 Ti200. Maybe less so for a gamer than for a programmer, but since this is “Coding: Advanced” I assume M/\dm/ falls into the latter category.

– Tom

The 5200 is a 4x1 architecture according to my own benchmarks (which also clearly identify the NV30 as 4x2 with ‘double-pumped’ Z units, btw).

As I’ve said, if you’re in for the features, go for it, I won’t argue with that.

But it won’t outperform the GF3 across the board - it technically can’t.

OK, it seems that for all the programmability it really comes down to the FX, plus NVIDIA’s developer support seems to be better than ATI’s: Cg against RenderMonkey, developer.nvidia.com against developer.matrox.com | developer.ati.com.
BTW, I found the 5200 doesn’t have Z compression. I hope there are no surprises like that in the fp instruction count etc. Or maybe I should be scanning the stores for an FX 5600?
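
I guess the only safe way is to query the limits on the actual card instead of trusting the spec sheets. A rough sketch (untested; assumes a current GL context, that the extension is present, and that the glGetProgramivARB entry point has been fetched, e.g. with wglGetProcAddress on Windows):

  #include <stdio.h>
  #include <GL/gl.h>
  #include <GL/glext.h>  /* GL_ARB_fragment_program tokens and typedefs */

  void print_fp_limits(PFNGLGETPROGRAMIVARBPROC glGetProgramivARB)
  {
      GLint total = 0, alu = 0, tex = 0;
      glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                        GL_MAX_PROGRAM_INSTRUCTIONS_ARB, &total);
      glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                        GL_MAX_PROGRAM_ALU_INSTRUCTIONS_ARB, &alu);
      glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                        GL_MAX_PROGRAM_TEX_INSTRUCTIONS_ARB, &tex);
      printf("fp limits: %d instructions (%d ALU, %d TEX)\n",
             total, alu, tex);
  }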

You’re dead set on an NVIDIA card then?
ATI’s drivers seem pretty solid these days, so I would have thought the choice is a lot clearer now.

Originally posted by M/\dm/:
OK, it seems that for all the programmability it really comes down to the FX, plus NVIDIA’s developer support seems to be better than ATI’s: Cg against RenderMonkey, developer.nvidia.com against developer.matrox.com | developer.ati.com.

I wouldn’t let this guide my decision. You can use Cg and RenderMonkey on both NVIDIA and ATI cards, and nothing stops you from visiting the other vendors’ websites.
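
For example, Cg isn’t tied to NVIDIA hardware as long as you compile to the vendor-neutral ARB profile; from memory, something like:

  cgc -profile arbfp1 -entry main -o shader.fp shader.cg

The output is plain GL_ARB_fragment_program assembly, which an R300 should load just as happily as an NV30.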

– Tom

i would go with an ati card: pleasing you as a developer with advanced features, you as a gamer with great performance, and you as an artist with brilliant image quality.

and you as a simple pc user with great stability and reliability.

and, as an opengl user, with fast and good support for extensions like ARB_vp and ARB_fp. the second one is only lazily and slowly supported by nvidia (or with bad image quality, below the lowest quality standard, whatever you choose).

but, most important: for you, as the one who pays the money, you get much more for your money from an ati card.

i would have said differently a year ago. but today, at this very moment, ati has the better cards.

and their cards fit nicely into opengl. something i never experienced on nvidia cards over the last years, actually.

Originally posted by davepermen:
pleasing you as a developer with advanced features, you as a gamer with great performance, and you as an artist with brilliant image quality.

Ever thought of a career in marketing, Dave?

Buy a 9500 Pro. Buy one now, while they can still be found (as they are being phased out by the 9600 Pro, which is a bit slower, but cheaper to make).

GFFX cards run shaders slowly compared to the ATI Radeon 9500/9700. ATI cards have very good shader hardware, but they have problems such as stuttering, low refresh rates on Win98/SE, lines running across the screen, SmartGart not always working with fast writes, driver installation errors, a defective heatsink shim allowing improper heat transfer (9700), and some others I forgot. If you don’t want hassles then buy NVIDIA; otherwise go over to the Rage3D forums and investigate first to see if you will have similar problems as some folks there.

I’m in the same situation as you, and ATI scares me. The bad thing is that right now there isn’t a good choice for us, unless you want to buy ATI’s 9800 Pro, which works great, I read. It fixed that shim issue and some internal GPU issues, I think. It has that F-buffer for unlimited fragment programs as well, as I recall. Not needed, of course, since we want to run in real time, not offline. I’m going to wait for NV35 and then see whether I go with NV or ATI. By then I’m hoping ATI will fix their driver bugs or improve their driver testing, now that they’ve got some cash from their R300 cash cow.

Do check out the [H]OCP site for GFFX reviews and pay special attention to the ShaderMark test app. NVIDIA hardware is 5x+ slower than ATI when it comes to version 2 shaders. The GFFX 5200 also lacks fillrate, so per-pixel lights/stencil shadows will really hurt you. They’re hurting my GF2 now, and the 5200 is not that far from a GF2 or GF4 MX400 (a GF2 in disguise).

Originally posted by dorbie:

Ever thought of a career in marketing Dave?

hehe, just repeating what about all the hw test sites tell ya.

about the one above who says if you want stable things, get nvidia stuff: that’s true for pre-FX stuff, but not for the fx. they haven’t actually brought out one driver which really works the way nvidia claims it should.

and the bug with the fan being disabled in screensavers is dangerous (running in 3d mode, but with no fan enabled; it gets hot, up until it possibly burns your pc).

anyways, gffx hw is not something good at all. 140°C in my pc says enough to me.

Yes, the Ultra is a total miss. I was shocked when I read http://www.nordichardware.com/reviews/graphiccard/2003/Gainward_FX_5800_Ultra/index.php. I think there’s no better review than this one.
But anyway, I’m not going to ATI either; I’ve seen their cards, drivers, etc. Sorry, they still have a way to go, although the R350 is almost there. And I’m not going for the 9500: think about the instruction count limit, that should be enough. I think the FX 5200 Ultra has everything I need to write programs for future hardware and to support future shading languages, plus 128-bit color, which ATI has only on their R350 series. The question is only about the limitations of the 5200 Ultra/5600. Is the 5200 enough?

BTW, “140°C”: keep smiling, you don’t need a boiler, and you can also skip a few trips to the kitchen for your cup of tea.