Number of textures in OpenGL

I currently have a VIA-based motherboard, and it's stable as hell with my GF4.

Of course, my next mobo will most likely be an NVIDIA chipset…

Originally posted by jwatte:
If you have a VIA based motherboard that's based on KT333 or newer, it's likely that the problem is with the VIA motherboard, not with the Radeon graphics card.

We've seen a lot of this, and that's the diagnosis we've arrived at. My next computer's an Intel! (unless those Athlon64s REALLY manage to convince me otherwise)

I actually have no clue what my motherboard is, except that it's a dual-CPU board. But if it was the motherboard, I assume it would freeze no matter what graphics card is installed. That problem only shows up with the Radeon 9800, so I assume it's the card (even the 9700 is OK).

My next computer’s an Intel! (unless those Athlon64s REALLY manage to convince me otherwise)

Slightly OT, but you can get an nForce-based motherboard for Athlons. Many people consider them superior to VIA chipsets.

AFAIK the nForce chipsets are only for Athlon chips. I have an nForce2 mobo with an Athlon XP 2600 right now with a GeForce 4 Ti 4400 in it and everything is perfectly stable…and fast, might I add.

-SirKnight

AFAIK the nForce chipsets are only for Athlon chips.

The point I was making is that he doesn’t have to switch to Intel processors just for a more stable/higher quality motherboard.

Well, long story short:

Nothing's perfect.

And because we were all nVidiots, we are still trained to think
"you have an ATI card and bugs? it's the ATI drivers"
as opposed to
"you have an NVIDIA card and bugs? your system is badly set up".

This was true for a long time.

Today we can say that BOTH run rock stable in a rock-stable system, and BOTH can get rather unstable in badly configured systems. I have crashing systems with NVIDIA and ATI, but only in bad systems.

I never got a single crash on, for example, the Compaq/HP PCs at work, no matter what mess I tried.

And yes, there is not much good about the GeForce FX that anybody cares about anymore. It has one big problem: it performs badly with DX9 code, or ARB_fp, and it is missing some features which are important for real future-style effects programming: real float texture support, for example.
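To make that concrete: on NV3x you only get float textures through NV_float_buffer, and only on the rectangle target, with no filtering and no mipmaps; on R3xx they work as ordinary 2D textures through ATI_texture_float. A rough sketch of both paths (enum names as in those extension specs):

```c
#include <GL/gl.h>
#include <GL/glext.h>

void create_float_textures(int w, int h, const float *pixels)
{
    GLuint nv_tex, ati_tex;

    /* GeForce FX path: NV_float_buffer + NV_texture_rectangle only,
       no filtering, no mipmaps */
    glGenTextures(1, &nv_tex);
    glBindTexture(GL_TEXTURE_RECTANGLE_NV, nv_tex);
    glTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_FLOAT_RGBA32_NV,
                 w, h, 0, GL_RGBA, GL_FLOAT, pixels);

    /* Radeon path: ATI_texture_float on a plain 2D texture */
    glGenTextures(1, &ati_tex);
    glBindTexture(GL_TEXTURE_2D, ati_tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA_FLOAT32_ATI,
                 w, h, 0, GL_RGBA, GL_FLOAT, pixels);
}
```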

well well…

I'm going to sleep now. Night.

The only problem with the FX is that you must learn to write programs in a new style; otherwise FX =~= R

Well, that's the "only" problem for developers. For gamers, the problem is that the card will never perform well for anything except top-seller games. Which is nice for top-seller games, but for all smaller things it isn't.

Yeah, GlideFX rocks. A mix between OpenGL extensions, Cg, and huge tool/SDK packages.

Fun.

A mix of OpenGL extensions, Cg, etc…

What?? Cg has nothing to do with the performance of the GF FX. It's a tool to write shaders more simply. If you don't like its output, you don't have to use it. Simple as that.

There isn’t much difference between NV_FP and ARB_FP, except the precision issues, which dont account for the performance differences, that much. It’s all to do with register usage apparently… Hopefully Det50 will alleviate this problem somewhat…

you haven’t informed yourself well about the extensions, and about the hw actually.

yes, register usage is a main point, but its not at all all you have to take care. cg is about needed to get good performance, and is a proprietary addon over the nv extensions => part of it.

Actually, NV_fp is much more powerful than ARB_fp, because NV_fp has derivatives “for free” at any point, and also supports predication (“only do this instruction if this status bit is set”).
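For those who haven't played with it, that looks roughly like this in NV_fp assembly (a toy program; the DDX and condition-code/predication syntax are per the extension spec):

```
!!FP1.0
# DDX: screen-space derivative of any computed value, at any point
DDX R0, f[TEX0];
MOV R2, f[COL0];
# the 'C' suffix makes an instruction also update the condition code
MOVC R1, R0;
# predication: the write only happens where the GT test passes on .x
ADD R2 (GT.x), R2, R1;
MOV o[COLR], R2;
END
```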

I really wish ARB_fp had the same smarts. Of course, once the R400 is available, it'll probably leapfrog the NV35, and then we'll all wait to see how the NV40 performs :)

Originally posted by davepermen:
[Cg] a proprietary add-on over the NV extensions => part of it.

At least it makes it possible to write shaders that can run quite fast, and with wider hardware support than ARB_fp.

and is a proprietary add-on over the NV extensions => part of it.

It's nothing of the sort. You can use Cg without ever having to use any NV extensions at all… I don't understand why you criticise it so much. At least they got off their arses and actually made a working HLSL compiler that works right now; some games are even shipping with it. Can that be said for the GL2 shading language? No, it can't.
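To be concrete, the same Cg source compiles straight to the vendor-neutral arbfp1 profile, with no NV extension anywhere. A made-up minimal shader:

```
// tint.cg -- made-up example: modulate a texture by a uniform color
float4 main(float2 uv : TEXCOORD0,
            uniform sampler2D decal,
            uniform float4 tint) : COLOR
{
    return tex2D(decal, uv) * tint;
}
```

`cgc -profile arbfp1 tint.cg` prints plain ARB_fragment_program assembly that loads with glProgramStringARB on any card exposing ARB_fp, Radeons included; pass `-profile fp30` instead and you get NV_fragment_program output.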

If ATI had produced it, you'd probably think it's the best thing since sliced bread.

No, definitely not. At least not if it were an ATI proprietary marketing tool only.

Sure, Cg is not officially there to work for the GeForce FX, but they work very hard on making people believe Cg and the GeForce FX are one. I see tons of people asking what GeForce FX they should buy to develop for Cg, and tons of people who buy a GeForce FX and don't even know you "could" work on it without Cg. And all of those are newbies, 16 and up.

Sure, Cg is not needed and not NVIDIA-only. But it's a psychological need they want to build with marketing.

Bah, NVIDIA's marketing is crap anyway. They put Dawn on the GeForce FX 5200 package… I want to see at which res you can watch her smoothly

Originally posted by davepermen:
I see tons of people asking what GeForce FX they should buy to develop for Cg[…]

Indeed…
The very thing I like about Cg is that it is not limited to a few cards.

They put Dawn on the GeForce FX 5200 package… I want to see at which res you can watch her smoothly

Very slowly. Strangely, it runs at the same slow rate whatever the resolution is; I see little or no improvement when reducing the resolution.

At least I do not need a hacked naked version to make it run on an ATI

Hehe, but the hack rocks. I can at least run it smoothly at every res… independent of the res, too, yep.

Originally posted by tfpsly:
At least I do not need a hacked naked version to make it run on an ATI

Actually, it works perfectly fine with her clothed, on ATI cards…