OT: GeForce3 slow?


I just got a new GeForce3 Ti 200. The fill rate compared to a GF2 MX is really amazing, but I’m wondering about the triangles-per-second rate. I was trying out NVidia’s BenMark5 ( http://developer.nvidia.com/view.asp?IO=BenMark5 ) and I only got 17 million triangles per second. That’s not much; I think my old MX even got 24 million.

Hmm, I think there must be a problem with AGP. Is there a way in Windows XP to see whether 4x AGP is enabled for the card? I remember I had a similar problem with the GF2 MX, where I had to enable AGP manually in the driver settings, but I don’t know how to do that in XP with the standard NVidia drivers.
Or could someone with a GF3 Ti 200 please try this BenMark5 and tell me how many tris/sec they get?

Thanks a lot in advance, and sorry for posting an off-topic question.


Tested on a GF2 MX/MX 400 and I get almost exactly 16 million (a smidgeon less on average). I’ll test on a GF3 if I remember when I get home.

Don’t the CPU and other system specs also play a role in this? (Not to mention screen resolution, bit depth, and all that good stuff.)

Don’t worry about it. Neither of your GeForces is likely to reach its theoretical triangle throughput. In real-world conditions, your GeForce3 will come out ahead, since it has less of a bandwidth bottleneck.

Korval, the guy has a GF3; he wants a test, not an opinion. I know the feeling, because I’ve had issues with AGP 4x in the past :-).

Elixer, check out the test.

[This message has been edited by dorbie (edited 05-13-2002).]

I get almost exactly the same result with the GF3, it’s not a Ti 200, just an original GF3, but both the GF2 MX400 and the GF3 seem to get almost exactly 16M tris with this test.

What CPU is that on, dorbie? I get 20M on a Radeon 8500 (P3 500, AGP 2x).

I’ll try it on a standard GeForce3 (Athlon 1400, AGP 4x) tomorrow.

I ran the test on the GeForce3 setup and got 16.4M …

[This message has been edited by PH (edited 05-13-2002).]

BTW, I had problems with AGP 2x vs 4x on 98SE; now that I’m on XP, it seems to be reporting the 4x I set in my BIOS. I use the free version of the SiSoft Sandra benchmark utilities to check my system. I also use wcpuid3, which tells me the AGP command mode I’m actually in instead of just what’s supported. It reports 4x but says fast writes and side band addressing are disabled. Hmmm… let me check my BIOS here.

PH, I tried the GF2 MX on an 800 MHz PIII and the GF3 on a 1900+ Athlon. The CPU is not the issue in this benchmark. I’m going to check whether I have fast writes disabled in my BIOS; I’ll run it again and get back to you. Your GF3 results are in line with mine, just a smidgeon over 16M/sec.

[This message has been edited by dorbie (edited 05-13-2002).]

I have fast writes enabled on the Athlon system (at least, that’s what I specified in the BIOS). My P3 unfortunately doesn’t support fast writes.

What about the AGP aperture size - would that be an issue in this benchmark? It gave quite a boost in the CodeCreatures benchmark (I changed the setting from 64 to 256 MB). Probably due to the large number of textures…?

Well, I looked, and fast writes are enabled in the BIOS. The WCPUID chipset utility reports it as supported but disabled. More annoying chipset/driver/graphics card quirks.
I’m sure I checked this a while back and it was reported as enabled.

It shouldn’t, I don’t think; there’s not a lot being drawn.

PH, could you see what wcpuid3 reports in the chipset section for fast writes on your Athlon system?

Hmmm… all disabled. Even my P3 has side band addressing enabled.

That’s what wcpuid3 reported ( though it also said it was supported ).

I remember something about this having to be enabled in the registry for GeForce3s (I think).

[This message has been edited by PH (edited 05-13-2002).]

I benchmarked my two systems; this is what I get:

AMD 1300 + GF2 MX 400: around 19M.
AMD 800 + GF3: around 27M.


Phew, so my GeForce3 isn’t slower than all its brothers out there…

Thanks for testing!!

I think you posted too soon, labas :-) You missed the last post.

I think we need fast writes enabled. I wasn’t quite trusting of my reporting tool, but the results speak for themselves. Something is wrong.

Bruno, what drivers?

I just read about this the other day when I noticed FW/SBA disabled even though I’d enabled them in the bios.

Supposedly, even if it’s enabled in the motherboard BIOS, most (retail/OEM?) NVidia cards don’t enable this by default because it can reduce stability. The solution is to flash the video card’s BIOS, but that comes with warnings, since it’s a risky process. I didn’t go into much depth reading about it, and I haven’t had time to think about doing it myself, but I probably will in the next few weeks.

I found a tool just now via Google that may help. I haven’t read much about it either, but there seems to be a fair bit of information:
If anyone is game to try this, let us know how it goes.

Hope that helps.

Using the default 1024x768x16, with fast writes on and sideband disabled, on the 28.32 drivers, I get 22.64 Mtris/sec on a Duron 800 + GF2.

Using 640x480x16 I get 22.79 Mtris/sec, and for a laugh I tried 320x200x16 and got 22.97 Mtris/sec.

Oh, if you want to play around with fast writes and SBA, go to http://www.geforcetweak.com/ - they have a util to toggle lots of options, and it works with the latest drivers.
Also, I think ffish is correct: SBA is disabled by the video card’s BIOS, since it caused more problems than speed gains.

On the GF3 machine I’m using the 28.32 drivers.
Fast writes and AGP 4x are on, and the latest VIA drivers are installed.
On the GF2 machine I don’t remember exactly which drivers it was, but it was from this month, 29.xx something.

Try updating your VIA drivers; maybe you’ll get lucky.