GeForce3 and cost

Matt is starting to scare me.

Originally posted by mcraighead:
[b]Then again, if MS wanted to withhold Windows from an OEM that threatened to use another OS, I would see nothing wrong with that.

  • Matt[/b]

I don’t know about American laws, but AFAIK at least here in Sweden it would be illegal to do that; possibly the same over there.

Ah, but as I’ve said before, I’m opposed to all antitrust laws.

  • Matt

Originally posted by Humus:
I don’t know about American laws, but AFAIK at least here in Sweden it would be illegal to do that; possibly the same over there.

They did this to IBM, with the whole Office deal and limiting the licenses they would sell them…

It’s in the testimony…

What’s to prevent OEMs or IBM from going down to Best Buy and picking up whatever MS products they want? I’m sure it would cost more than their sweet volume discounts, but I don’t think there’s anything MS could do to stop it.

– Zeno

Anyone heard any of the theories about MS buying out Corel to stop Corel Linux eating into the ‘windoze idiots’ market share?
There was also a suggestion that MS would make a Linux distribution! Strange, but what if it were true? Next we will be buying MS TVs and eating MS pizzas cooked in an MS oven, in a house made by MS, in an entire city run and created by MS.
Whose idiotic idea was it to split MS up? Now it is just harder to destroy them; if they were one, you could nuke them (‘not literally!’) and get rid of the lot. Now they will keep splitting, and it won’t be long before an MS hardware division is created; we already have MS mice, keyboards, joysticks and pads. The new world order, noooo, let’s escape to the mountains (and hope the FBI/whoever don’t bother us).

[This message has been edited by Tim Stirling (edited 05-26-2001).]

Anyone know the story of “Big Brother is watching you”? (Not the TV show, the original…)

big billy…

Too much Deus Ex, right?
With all those conspiracies and such…

Xbox smecksbox… why would I ever want one?
OK, if they release an OpenGL dev kit/driver set for it, then sure - much more interesting - but then won’t that be the same as my PC? Actually, probably less than my PC (dual 1GHz) [+ GF4 by the time it comes out in the UK].

I’m borrowing a PS2 atm and I kinda like it - sure, the games are a bit ropey atm, but GT3 is great (played it @milia). Also, I think the XB is too large for my living room. Apparently the Japanese at TGS were a bit confused by its size too - some even commented that it was more like a tea table than a console.

I’m just not sure that the Xbox will really take off, and everyone wants to back the winning horse.

Anyway, there’s my two pence worth.

(BTW I would love a GF3, but I think I’ve still got a lot of mileage left in my GF1 yet)

At the risk of dragging out yet another wildly off topic thread…

Originally posted by Humus:
I don’t know about American laws, but AFAIK at least here in Sweden it would be illegal to do that; possibly the same over there.

Matt said he would see nothing “wrong” with it.

What is right and wrong, and what is legal and illegal, are often two very different things.

And on this I agree with Matt. And no, I am not a fan of Microsoft (I run Linux exclusively at home). Rather, I’m a fan of liberty.

My Cdn$0.02

Well, I didn’t label it as right or wrong; I just pointed out that it was illegal. But I do think it’s wrong to do that, and I fully support such laws, even though I understand that other people may not share my views or opinions.

[This message has been edited by Humus (edited 05-27-2001).]

Size matters - the smaller the better! The Xbox is just too big; that’s why I like the GameCube: it’s the size of 10 CD cases and sits in the palm of your hand, yet it still has the same power as the Xbox. Also, the GameCube will have THE best games.
Time to start a new thread: Intel vs. AMD…

Did you know that even though AMD is doing very well, with ever-increasing sales, their sales figures are tiny compared to Intel’s?

Have you heard the rumours that Intel and MS are basically the same company, run together but carefully presented as two separate companies?

Who will win the 64-bit war: AMD’s early x86 64-bit, or Intel’s later, first non-x86 CPU in years and years? What about the supply of motherboards for these? How much of a difference will 64-bit technology make?

Though AMD is still a small fish compared to Intel, they have certainly picked up in the last year and can’t really be considered tiny.
>>For chip manufacturer AMD, the year 2000 meant a surge in sales due to an immensely popular processor, the Athlon. Between the processors sold early in 2000 based on the K75 core, and those sold in the second half of 2000 based on the Thunderbird core, the Athlon has achieved a 24% unit share, bringing AMD’s total desktop share to 38%. Not bad for a company that previously struggled for a waning position on the value market - not bad at all.
So how has AMD emerged from 2000 in such a successful position? Not only have they matched and surpassed the performance of Intel’s Pentium III platform on a clock for clock basis, but you can pick up an Athlon at 1.2GHz for about $50 less than Intel’s Pentium III 1GHz. Value and performance made AMD the fastest growing CPU manufacturer last year.<<

All I can say about MS is that they’re doing just what any other company would do in their place; they’re all dishonest, looking at the Xbox and the ‘PC-like games’ that will come with it. I’ve got the feeling MS have made a big mistake that is going to end up costing them billions. After seeing what a few of the Nintendo games looked like (on the web), my money’s on them in the battle of the three.

Does anyone know of an OpenGL-related discussion forum?

Originally posted by Zeno:
[b]That’s true, but it would be slower, wouldn’t have usable anti-aliasing, wouldn’t have hardware support for vertex programs, texture shaders, or depth buffer based shadows. Oh, and I’d have to upgrade it sooner.

I am a developer after all, and this new stuff is fun

– Zeno[/b]

Hi Zeno,
About your post on the GeForce3 and the fact that you are a software developer: I am a software developer as well, but not even in my craziest dreams would I buy a GeForce3 for that reason, not at this frankly perverse price. Yes, anti-aliasing surely looks better, but a simple blur is not anti-aliasing, just a fake, and it is even a disadvantage in games such as Counter-Strike, where you really need to look at every pixel because it could be an enemy.

The GeForce3 surely offers a lot of possibilities, but to come back to the fact that you are a software developer: what market value do you think supporting the GeForce3 actually gives your game, or whatever you build? I get a monthly statistic of which cards are bought and used by the mass market, and the GeForce3 is not even listed, meaning it’s under one percent; I would be surprised if it had even 0.1%, and that will not change in the coming months. Even NVidia as a whole doesn’t own the market: the mass market has Intel and ATI chips, there are a lot of TNTs in use, and still plenty of Voodoos as well. Slowly the GeForce 1 is becoming a standard, and soon it will be the GeForce 2 MX, but beyond that I see no value in the GeForce3 at the moment.

It’s surely fast, and it can surely render a lot of vertices per second, but you should never forget that those vertices also need to be calculated first. Before buying a GeForce3 really makes sense, the processor needs to be faster, the AGP port too, and the price has to come down to around $150; then it will slowly gain importance, and until that date I will not waste my time on pixel shaders or whatever. And I am sure that if no money or hardware were flowing from NVidia’s side to some developer firms, they would never support this card directly, because it would not pay off for them within the next year.

Personally I am very NVidia-religious; their developer support is great, whether by phone or by e-mail, and I would never put any card except an NVidia one into my own PC. But my second PC has an ATI card and the third an Intel chip, because that, however sad it is, is the market right now. Surely all of us developers would love scenes with 10,000 polygons per tree, bump-mapped, shadowed and with pixel shader support in every way, but for today, and also for the coming Christmas season, when our current products will be published: VALUELESS.

    Michael Ikemann / Virtual XCitement GmbH

Mr. Calab, exactly what kind of software development are you employed in? I am a 3D graphics developer in an American university research lab, and like Zeno I bought a GeForce 3 to explore the new hardware T&L features of that card. I’m curious as to why you think the vast majority of the world does not need the capabilities of the GeForce 2, much less the GeForce 3. I count quite a few of my most recent game purchases (of the last year) that would perform poorly on less than a GeForce 2: Black and White, Nocturne, Serious Sam, Alice, and Blade of Darkness, just to name a few. Many of these games also run even better on the GeForce 3, playable at resolutions of up to 1280x1024x32-bit (the maximum resolution of my monitor) with quincunx or 4x anti-aliasing turned on!

Plus, a quick browse of the Nvidia website shows another dozen games in development that will specifically exploit the vertex and pixel shading capabilities of the GeForce 3. I realize that the average PC does in fact have an ATI, TNT, or Intel low-end 3D accelerator, but the fact is that a computer gamer’s machine is not the average PC, and both developers and consumers need to take advantage of new hardware to continue advancing the realism and entertainment value of PC games.

Originally posted by Chromebender:
[b]Mr. Calab, exactly what kind of software development are you employed in? I am a 3D graphics developer in an American university research lab, and like Zeno I bought a GeForce 3 to explore the new hardware T&L features of that card. I’m curious as to why you think the vast majority of the world does not need the capabilities of the GeForce 2, much less the GeForce 3. I count quite a few of my most recent game purchases (of the last year) that would perform poorly on less than a GeForce 2: Black and White, Nocturne, Serious Sam, Alice, and Blade of Darkness, just to name a few. Many of these games also run even better on the GeForce 3, playable at resolutions of up to 1280x1024x32-bit (the maximum resolution of my monitor) with quincunx or 4x anti-aliasing turned on!

Plus, a quick browse of the Nvidia website shows another dozen games in development that will specifically exploit the vertex and pixel shading capabilities of the GeForce 3. I realize that the average PC does in fact have an ATI, TNT, or Intel low-end 3D accelerator, but the fact is that a computer gamer’s machine is not the average PC, and both developers and consumers need to take advantage of new hardware to continue advancing the realism and entertainment value of PC games.[/b]

Hi Chromebender, I’m working at a game software firm in Mühlheim, western Germany. It’s surely true that a GeForce3 is faster than a GeForce 2 or 1, but that is not what I wanted to say. You said that the gamer’s machine is not the average machine, so I ask myself: why does not one of my friends, who are surely not poor and have bought tons of games in the last months, own a GeForce3? And why does only about one in ten have a GeForce 2, although they are all absolutely fanatical gamers?

Of course the GeForce 2 and 3 offer a big range of possibilities; the GeForce 2 has some great reflection/refraction effects, for example, and I have already seen some movies of pixel shaders in action. Yes, surely they look great, and it’s no question that this will become a standard sometime in the future, but really just “sometime”, not at the moment. I think I will implement some GeForce 2 features in our game, T&L in any case, with a fallback path for older cards (as sketched below), since the GeForce 1 is slowly getting a little piece of the market. But pixel shaders, still this year?!

I hope, I really hope, that NVidia will publish a new chipset in the coming months so that the price of the GeForce3 falls like a stone; if not, then I will surely not waste my time on pixel shaders, because then they won’t have any market value either. I know there are a couple of firms already supporting GeForce3 features, but that’s because they don’t need to look at the money, just at the image. The firm I work for has to look at the ratio between investment and profit, and investing in the GeForce3 at the moment will surely not bring in a penny more than going without; that’s my opinion.

At the moment there is just one feature I am really waiting for, and that is hardware-rendered volumetric soft shadows without needing the stencil buffer. If a card supporting that feature were sold, that would be something really new. But when Carmack said “And now we are even able to render the pores of the skin”, and he showed just one human in the Doom 3 demo, and the scene was already no longer running smoothly but was half a slide show… yeah, “And this is the GeForce 3” ;-). It’s fast, surely, but it’s in no way a revolution. A card that can render ten million polygons at a framerate of 100, completely soft-shadowed, with 50 point light sources and reflecting objects: that would be a revolution… and then the “pores” Carmack talked about would also be a bit more realistic than his, in my opinion, more than ironic sentence.

      Michael
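
A minimal sketch of the runtime check such a fallback strategy implies. Parsing the GL_EXTENSIONS string was the standard idiom of the time; the render-path names and the particular NV extensions chosen here are illustrative assumptions, not something from Michael’s post.

[code]
/* Requires a current OpenGL context; glGetString() returns NULL otherwise. */
#include <string.h>
#include <GL/gl.h>

/* Returns 1 if 'name' appears as a complete token in GL_EXTENSIONS. */
static int has_extension(const char *name)
{
    const char *start = (const char *)glGetString(GL_EXTENSIONS);
    const char *ext = start;
    size_t len = strlen(name);

    while (ext && (ext = strstr(ext, name)) != NULL) {
        int at_token_start = (ext == start) || (ext[-1] == ' ');
        if (at_token_start && (ext[len] == ' ' || ext[len] == '\0'))
            return 1;        /* exact token match */
        ext += len;          /* prefix match only, e.g. ..._program1_1; keep looking */
    }
    return 0;
}

/* Hypothetical render paths: take the best one the card supports. */
enum render_path { PATH_BASIC, PATH_COMBINERS, PATH_SHADERS };

enum render_path choose_render_path(void)
{
    if (has_extension("GL_NV_vertex_program"))
        return PATH_SHADERS;     /* GeForce3-class hardware */
    if (has_extension("GL_NV_register_combiners"))
        return PATH_COMBINERS;   /* GeForce 1/2-class hardware */
    return PATH_BASIC;           /* TNT/Voodoo-class fallback */
}
[/code]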

MrCalab, no offense, but could you start a new paragraph every now and then in your future posts? It makes it easier on the rest of us.

Anyway, you’re overlooking the fact that this is a developer forum. It’s us, the developers, who have to adopt new cards like the GeForce3, not the end users. The end users will upgrade when they see games that make full use of the hardware.

Sure, if you want to ship a mass-market game like The Sims or whatever, you can’t have it require a GeForce3. If, on the other hand, you are working on a high-profile title, or if you are just a big graphics geek like most of us here, then why wouldn’t you want to experiment with the latest and greatest hardware? (If you can afford it, of course)

  • Tom

Yes, I think that advancement in the field of 3D graphics is a back-and-forth effort between the hardware vendors and us, the graphics programmers. First the chip designers add a few new features to their hardware, then it is up to the developers to entice consumers to pay for the new hardware. Sure, the average user doesn’t need to pay $400 for a little more eye candy, but remember: critics once said the same thing about VGA too.

Microsoft is more subtle in its tactics. It does things like refusing to release OpenGL 1.2 for Windows (even though it could have) while accusing the ARB of being glacial in its progress and promoting D3D’s functionality and streamlined development, even though huge swaths of D3D functionality are exclusive to a single IHV or even a single card. Fortunately, Microsoft can’t stop IHVs from implementing OpenGL 1.2 plus extensions in their own ICDs. IHVs do this mainly because the market demands it, thanks largely to developers like John Carmack who leverage the popularity of their games to keep the world a freer place for us all.
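
In practice, that means Windows developers fetch everything past OpenGL 1.1 from the ICD at runtime, because Microsoft’s opengl32.dll only exports the 1.1 entry points. A minimal sketch of the usual wglGetProcAddress pattern (glDrawRangeElements really is an OpenGL 1.2 function; the typedef and variable names here just follow the common convention):

[code]
#include <windows.h>
#include <GL/gl.h>

/* opengl32.dll stops at OpenGL 1.1, so newer entry points must be
   fetched from the vendor's ICD. A current OpenGL context must be
   bound when wglGetProcAddress is called. */
typedef void (APIENTRY *PFNGLDRAWRANGEELEMENTSPROC)
    (GLenum mode, GLuint start, GLuint end,
     GLsizei count, GLenum type, const GLvoid *indices);

static PFNGLDRAWRANGEELEMENTSPROC pglDrawRangeElements;

/* Returns 1 if the driver exposes the OpenGL 1.2 call, 0 otherwise. */
int load_gl12_entry_points(void)
{
    pglDrawRangeElements = (PFNGLDRAWRANGEELEMENTSPROC)
        wglGetProcAddress("glDrawRangeElements");
    return pglDrawRangeElements != NULL;
}
[/code]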