
You know what, I know this will keep going on, so maybe this will stop it. This is another topic, but it really is on OpenGL vs. D3D. So this would have been the result of our nonsense.

Author Topic: OpenGL vs Direct3D
skater_g
Contributor posted 03-13-2001 01:18 PM

I’m writing a persuasive paper for HS and I’m wondering what experiences people have had with OpenGL and Direct3D. Which do you personally prefer and why? I only use OpenGL. Are there any drawbacks that you have found in your preferred API? Is there a different preference when using different languages, such as Visual Basic or C? Are there any areas of improvement that you think will need to be addressed in future versions? Please tell me whatever you are willing to share. I really would value your input. Thanks.
Bryan

john
Frequent Contributor posted 03-13-2001 03:30 PM

Hi! I’m wondering how to fuel a bush-fire. See, I’ve got several hundred drums of petrol, and this works, BUT I can only turn several hundred hectares of scrub into a smouldering mess. I was wondering… has anyone had much success with nuclear weapons?
cheers,
John

[This message has been edited by john (edited 03-13-2001).]

Gorg
Frequent Contributor posted 03-13-2001 04:21 PM

Here we go again!!

Deiussum
Frequent Contributor posted 03-14-2001 11:09 AM

John, I think your nuclear weapon idea sounds like a good one. Not only do you get a lot of initial damage, but also the benefits of the whole mushroom cloud and nuclear fallout thing.
Skater_g, do you seriously think you’re going to get objective opinions on OpenGL vs. Direct3D on a board that is primarily for discussing OpenGL? Post the same thread on a Direct3D board and you’ll get opinions that are completely different.

As far as differences between the two go, OpenGL is easier to learn for many people, but a lot of people claim D3D is easier…

OpenGL has the whole extension thing going for it, while D3D gets a new update every year (and in some cases those changes require a drastically different approach to the way you program).

skater_g
Contributor posted 03-15-2001 06:39 AM

Deiussum,
Thank you for your response. It is all I need for the 4th source for my paper. Hehe…I knew it wouldn’t be the best idea to post the message on an OpenGL board, but since I’m writing my paper to persuade people to use OpenGL, I figured this would be a good place to get some info…some serious info.

In case anyone is wondering, I cannot stand to use D3D. I dislike it with a passion, and I refuse to code with D3D anymore. Sorry though, I shouldn’t have even brought it up…

john
Frequent Contributor posted 03-15-2001 04:30 PM

but, if you dislike something, then you presumably know WHY you dislike it. I don’t like… oh, say, COBOL, for very well-defined reasons: the syntax structure is brain DEAD, the semantics are screwed in the head, and any compiler on a unix system that is called “rmc” is just screaming for trouble. (“rm” being the ReMove command… which is nice, but not when some of us press the space bar too early =)
but… I know why I don’t like COBOL, and I can cite reasons why it IMHO sucks… so I don’t need to ask ppl why it sucks when I already know why!! So, if you’ve used D3D and don’t like it (which you say you don’t…) then… why ask ppl for their opinions when you can form one yourself? “I think OpenGL is better than D3D for reasons X, Y and Z, with the following supporting evidence”…

feh!

cheers,
John

andreiga
Contributor posted 03-18-2001 05:40 PM

With OpenGL you can have per-pixel diffuse+specular lighting on GeForce2-family chipsets (GTS, MX, etc.) through the register combiners extension, like in Doom3. I said like in Doom3, but this doesn’t mean that Doom3 will run on a GeForce2. Indeed, the GeForce3 is more powerful (it adds more combiners and constants, plus a new extension named texture shader), but this doesn’t mean that a GeForce3 is a must (for now).
Unlike OGL, DX8 only wants the new advanced NSR from the GeForce3 (which is STUPID) and makes the GeForce2 a GeForce1 with more fillrate.
In conclusion, in the future you may see games with per-pixel lighting running only on OGL (because of the number of GeForce2-family chipsets on the market).
One other important thing is support for the fence/VAR extension, which is MUCH better implemented than the DX8 vertex buffers, so you can have many more triangles rendered per frame (believe me, I’m first a DX programmer and second an OGL programmer).
P.S. The vertex programming extension (or the DX8 vertex shader) is a plus of the GeForce3, but I’ve recently seen some benchmarks made with 3DMark 2001, and the emulation (which I think is done completely on the CPU, but I’m not so sure about that) is almost as fast as on the GeForce3 (this extension is needed to set up the per-pixel lighting).
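
(Roughly, a single-combiner per-pixel diffuse setup with NV_register_combiners looks something like the sketch below. It is only a sketch, not code from this thread; it assumes the normal map is bound to texture unit 0 and the tangent-space light vector, range-compressed into [0,1], is passed in the primary color.)

[code]
// Sketch: N.L per-pixel diffuse with one general combiner (GeForce1/2/3).
// Assumes the normal map on texture unit 0 and the tangent-space light
// vector in the primary color, both stored range-compressed to [0,1].
glEnable(GL_REGISTER_COMBINERS_NV);
glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);

// Combiner 0, RGB portion: spare0 = expand(tex0) . expand(col0)
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                  GL_TEXTURE0_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                  GL_PRIMARY_COLOR_NV, GL_EXPAND_NORMAL_NV, GL_RGB);
glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB,
                   GL_SPARE0_NV, GL_DISCARD_NV, GL_DISCARD_NV,
                   GL_NONE, GL_NONE,
                   GL_TRUE,   // treat the AB product as a dot product
                   GL_FALSE, GL_FALSE);

// Final combiner: output = spare0 * 1.0 (pass the dot product through).
glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_SPARE0_NV,
                       GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_ZERO,
                       GL_UNSIGNED_INVERT_NV, GL_RGB);
[/code]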

LordKronos
Frequent Contributor posted 03-18-2001 06:05 PM

quote:

Originally posted by andreiga:
With OpenGL you can have per-pixel diffuse+specular lighting on Geforce2 family chipsets (GTS, MX etc.) through the register combiners extension

Well, the drivers are supposed to have a way that allows D3D8 apps to get access to the register combiners. I can’t say for sure which functionality is available this way, but it’s something you should be made aware of.

quote:

Originally posted by andreiga:
Unlike OGL, DX8 only wants the new advanced NSR from the GeForce3 (which is STUPID) and makes the GeForce2 a GeForce1 with more fillrate.
Just so you know, the Geforce2 IS a Geforce1 with more fillrate. While there are architectural differences between the cards, there is no difference in functionality. Both cards support the exact same features. From a user/developer standpoint, the only difference is speed.

quote:

Originally posted by andreiga:
One other important thing is support for the fence/VAR extension, which is MUCH better implemented than the DX8 vertex buffers


I can’t argue this as fact, but if you use vertex buffers correctly (discard contents & no overwrite), I believe the D3D driver should be able to make use of the fences to get optimal results. Maybe Matt can confirm/refute this, but I wouldn’t be surprised if he never even comes in here (given the title of the thread).
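
(For what it’s worth, the “correct” usage being referred to is roughly the DX8 dynamic vertex buffer lock pattern. A sketch only; pVB, offset, bytes and bufferSize are hypothetical bookkeeping, not anything from this thread.)

[code]
// Sketch of the DX8 dynamic vertex buffer "discard / no-overwrite" pattern.
// pVB is assumed to have been created with D3DUSAGE_DYNAMIC | D3DUSAGE_WRITEONLY
// in D3DPOOL_DEFAULT.
BYTE* pData = 0;
DWORD flags = (offset + bytes > bufferSize) ? D3DLOCK_DISCARD : D3DLOCK_NOOVERWRITE;
if (flags == D3DLOCK_DISCARD)
    offset = 0;                       // start over in a freshly renamed buffer

if (SUCCEEDED(pVB->Lock(offset, bytes, &pData, flags)))
{
    memcpy(pData, vertices, bytes);   // append this batch of vertices
    pVB->Unlock();
    // ... DrawPrimitive over the range [offset, offset + bytes) ...
    offset += bytes;
}
[/code]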

mcraighead
Frequent Contributor posted 03-18-2001 10:25 PM

I don’t know about DX8, but on DX7, VAR was definitely superior. I think DX8 fixes some but not all of the DX7 vertex buffer problems.

  • Matt
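
(For readers who haven’t used it, the VAR + fence pattern Matt is talking about looks roughly like this. A sketch based on the NV_vertex_array_range and NV_fence specs, error handling omitted.)

[code]
// Sketch: NV_vertex_array_range + NV_fence. Allocate AGP/video memory,
// point the vertex array range at it, and fence each frame's writes so
// the CPU never overwrites data the GPU is still reading.
GLsizei size = 4 * 1024 * 1024;
void* mem = wglAllocateMemoryNV(size, 0.0f, 0.0f, 0.5f);  // ~0.5 priority -> AGP memory

glVertexArrayRangeNV(size, mem);
glEnableClientState(GL_VERTEX_ARRAY_RANGE_NV);

GLuint fence;
glGenFencesNV(1, &fence);
glSetFenceNV(fence, GL_ALL_COMPLETED_NV);   // so the first FinishFence returns at once

// per frame:
glFinishFenceNV(fence);                     // wait until last frame's reads finish
// ... write this frame's vertices into mem, glVertexPointer into mem ...
// ... glDrawElements(...) ...
glSetFenceNV(fence, GL_ALL_COMPLETED_NV);
[/code]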

andreiga
Contributor posted 03-19-2001 02:13 PM

quote:

Originally posted by LordKronos:
Just so you know, the Geforce2 IS a Geforce1 with more fillrate. While there are architectural differences between the cards, there is no difference in functionality. Both cards support the exact same features. From a user/developer standpoint, the only difference is speed.

Reply:

Maybe what you say is right (since I have a GTS and I never had a GeForce1 to see if the register combiners work), but don’t forget that the NSR is a new component in the GeForce2.
One more thing: in the NVIDIA OpenGL SDK they show the power of the register combiners and how to use them, but ONLY on the GeForce2 and GeForce3
(the latter with an enhanced version of this extension).

Lars
Frequent Contributor posted 03-19-2001 05:04 PM

And you can do per-pixel lighting on the GeForce1 and up under Direct3D by using the dot-product operation for the texture stage state… don’t know the syntax at the moment, but it works. You haven’t got the full freedom that the combiners give you, of course.
Or wasn’t there something??? I remember a hack for the TNT cards, where you could use their combiners by setting specific stages to specific states in the D3D pipeline.
Maybe you can do this with the GeForce too :)
Lars
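
(The syntax being referred to is most likely the DOTPRODUCT3 texture op. Roughly, and only as a sketch for a DX8 device with the normal map in stage 0 and the light vector packed into the vertex diffuse color:)

[code]
// Sketch: per-pixel N.L under Direct3D via the DOTPRODUCT3 texture op.
device->SetTexture(0, pNormalMap);
device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_DOTPRODUCT3);
device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);

// Optional second stage: modulate the dot product with the base texture.
device->SetTexture(1, pBaseMap);
device->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
device->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
device->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);
[/code]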

LordKronos
Frequent Contributor posted 03-19-2001 05:11 PM

quote:

Originally posted by andreiga:
but don’t forget that the NSR is a new component in the GeForce2.

What features does the NSR provide? I’ll tell you… NONE. It’s a marketing term. Everything (feature-wise) in the NSR was available on the GeForce 256. Now, certainly the 256 didn’t have 4 pixel pipelines (which is part of the “NSR” thing), but all additional pipelines equate to is performance. Everything a GeForce2 can do (or at least everything NVIDIA has disclosed thus far) can be done on a GeForce 256.

andreiga
Contributor posted 03-19-2001 06:07 PM

quote:

Originally posted by Lars:
And you can do per-pixel lighting on the GeForce1 and up under Direct3D by using the dot-product operation for the texture stage state… don’t know the syntax at the moment, but it works. You haven’t got the full freedom that the combiners give you, of course.
Or wasn’t there something??? I remember a hack for the TNT cards, where you could use their combiners by setting specific stages to specific states in the D3D pipeline.
Maybe you can do this with the GeForce too :)
Lars


As I said before, I don’t know how a GeForce1 works under OGL (maybe it works the same as the second generation), but the point is that under DX everything is set as render states (light vector, etc.) and you can do only diffuse per-pixel lighting (you can’t set the half-vector and specular power, which are necessary for specular lighting). I have to remember the name of the topic: OGL vs D3D. Of course, if you have a GF3 the same lighting computations can be done under DX8 and OGL, but how many GF3s are on the market?

Gorg
Frequent Contributor posted 03-19-2001 08:58 PM

You can use EMBM in Direct3d to make specular highlights if you don’t have access to pixel shaders.

kaber0111
Frequent Contributor posted 03-19-2001 11:30 PM

>to is performance. Everything a Geforce 2
>can do (or at least everything nvidia has
>disclosed thus far) can be done on a
>GeForce 256.

marketing scam, hehe
Microsoft pasting game characters into their screenshots…

now that IS a marketing scam.
pfff.

XBox is a blunder;
yeah, I like the hardware, but personally I think it will be an enormous flop.
MS is going at this the wrong way,
and it’s going to cost them.
And I think the street is frowning on their actions as well.

-akbar A.

andreiga
Contributor posted 03-20-2001 04:15 AM

quote:

Originally posted by Gorg:
You can use EMBM in Direct3d to make specular highlights if you don’t have access to pixel shaders.

Yes, but EMBM is not supported on the GeForce1 and GeForce2, and again, how many GeForce3s are on the market? (I’m not considering the G400 because of its poor performance, or the ATI RADEON because of the very, very poor drivers.)

Ingenu
Frequent Contributor posted 03-20-2001 04:26 AM

The KYRO and KYRO II (PowerVR Series 3) support EMBM.
They have the best quality/speed/price ratio on the market.

LordKronos
Frequent Contributor posted 03-20-2001 04:54 AM

quote:

Originally posted by kaber0111:
marketing scam, hehe
Microsoft pasting game characters into their screenshots…
now that IS a marketing scam.
pfff.


Marketing scam? Well, calling it a scam is debatable. Certainly the features of the NSR weren’t new compared to a GeForce256. However, there was a performance increase, and as someone from NVIDIA said to me on the topic once (I’m paraphrasing), “no, it’s not new, but often the increased performance can make the difference between these types of effects being feasible or not”. Then again, isn’t most marketing a scam in one form or another?

As for the XBOX screenshot “scam”, I would disagree with you, but that’s way off topic so I won’t bother.

cass
Frequent Contributor posted 03-20-2001 05:19 AM

Being in the marketing dept at NVIDIA, I would say it’s about “spin” and timing. There was a right time to push hard for NV_register_combiners as a “branded” feature – and that time was GeForce2 launch. NSR was a much more suitable name for marketing.
NV_register_combiners was pushed to developers from the GeForce256 on, because for consumers to enjoy a feature like NSR, developers have to program to it.

Thanks -
Cass

ET3D
Frequent Contributor posted 03-25-2001 12:47 PM

I’m programming OpenGL now, but I’m following D3D, and might start programming it at some point. At least in terms of simplicity, I think that D3D is getting better with every version. I like the easy loading of textures (including video textures), and the idea of Effects. I’m glad that when I finally get into it, it’ll be considerably easier than it was when I started following it (which was with DX6).

Nutty
Frequent Contributor posted 03-25-2001 01:20 PM

[RANT]
bah… don’t you mean “considerably gayer”???

I really can’t understand the fuss about D3D. I can’t find one single reason to use it over OpenGL. If someone can give me some good reasons to learn it… I might just do that…

M$ have already buggered up the OS market with their flimsyware/bloatware crap OSes… and even worse products… (though in their defence I do like Visual C++). And it seems they want to dominate the 3D market with their lame plagiarised API.

I’ll be smiling ear to ear the day M$ go bankrupt…

[/RANT]

Nutty.

jwatte
Frequent Contributor posted 03-25-2001 05:43 PM

quote:

any compiler on a unix system that is called “rmc” is just screaming for trouble

Do what I do, and just rename it “cc” for “cobol compiler” :) :)

kaber0111
Frequent Contributor posted 03-25-2001 09:26 PM

>I really can’t understand the fuss about
>D3D? I can’t find one single reason to use
>it over OpenGL. If someone can give me some
>good reasons to learn it… I might just do
>that…
Umm, I know this is an OpenGL forum…
but it’s something called fragmenting.

Like ATI has “their extension” to do dot3 lighting in OpenGL,
where NVIDIA has its “own way” to do this
as well…

But in D3D, it’s unified and there is only one way/interface to do it…

CViper
Contributor posted 03-25-2001 11:46 PM

Well, I’ve been “learning” D3D out of curiosity lately… well, by mistake, D3D RM (retained mode or something)… First you have to battle your way through about 10,000 COM objects, and then you still can’t do what you really want to (you can load files directly, but specifying vertices directly, well, forget it). I guess D3D IM (immediate mode) is somewhat better in that way (I don’t really know, though).
And a little note on the Xbox stuff: they froze Halo (the Bungie game) until they release the Xbox… just because of that I hate M$… and the fact that they support D3D more than OGL just makes it worse.

Nutty
Frequent Contributor posted 03-26-2001 02:32 AM

It’s nothing more than trivial to write a wrapper for several extensions of the same functionality… That’s basically all D3D does…
Though with OpenGL, you might get access to more features (a la pixel shaders) as you’re not limited to D3D’s implementation.

Nutty

zed
Frequent Contributor posted 03-26-2001 03:56 AM

>>like ATI has “their extension” to do dot3 lighting in OpenGL,
where NVIDIA has its “own way” to do this
as well…<<
I just noticed a few more extension specs have been posted, including http://oss.sgi.com/projects/ogl-sample/registry/ARB/texture_env_dot3.txt

(wtf is there a clear fields button for, he saiz after pushing that instead of send)
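
(With that ARB extension the “one unified way” argument largely applies on the GL side too. A sketch of DOT3 through ARB_texture_env_combine/ARB_texture_env_dot3, assuming both extensions are exported, a normal map bound on unit 0, and the light vector in the primary color:)

[code]
// Sketch: DOT3 bump mapping via the vendor-neutral ARB path zed links to.
const char* ext = (const char*)glGetString(GL_EXTENSIONS);
if (strstr(ext, "GL_ARB_texture_env_dot3"))
{
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB,  GL_DOT3_RGB_ARB);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB,  GL_TEXTURE);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB,  GL_PRIMARY_COLOR_ARB);
}
[/code]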

kieranatwork
Frequent Contributor posted 03-26-2001 12:32 PM

I don’t know about in America, but here in England, seeing Bill Gates announce the XBox in his knitted sweater accompanied by some anonymous WWF wrestler did nothing for its potential appeal to 18-30 year olds… he doesn’t seem to realise that the average console buyer does not know or care what pixel/vertex shaders are, or that its fill rate is double/treble that of the PS2 or whatever whatever whatever… they’re interested in innovative, exciting and most probably Japanese games.
They’ll continue to buy PS2s, and the XBox will go the same way as the MSX… straight into the classified ads in local newspapers. Shame… but I hope to god that NVIDIA have not invested too much in this doomed project - I think they deserve better…

Nutty
Frequent Contributor posted 03-27-2001 01:19 AM

Not sure I agree with that. Being in the games industry myself, I reckon it’s gonna totally stomp all over the PS2. It’s basically a fixed-spec, highly optimized PC… and there are loads of developers out there who would love to get their hands on this thing…
The PS2 ain’t really that hot… it’s really useless at textures… only 4 megs of VRAM, and no hardware texture compression.

Another good thing about the XBox is that it’s really easy to develop for… Basically just get a GeForce 3 PC… with DX8, and you’re 90% there… Very easy to port existing PC games too… especially if they were already written using D3D…

Nutty

zed
Frequent Contributor posted 03-27-2001 01:10 PM

I’ve written a paper entitled “why the XBox won’t succeed”.
As Nutty saiz, it’s gonna be a hit with developers; unfortunately, the developers DON’T buy the games.
kieranatwork is far closer to the money.

maxuser
Contributor posted 03-27-2001 02:36 PM

Few people despise M$ more than I do, but in the PC software industry, they’re the 800-pound gorilla that can bully everyone else, and they usually get what they want. Why else could an inferior API like the early versions of D3D survive? M$'s often successful strategy is to put out an inferior first version (like Windows, IE, D3D, etc.), let the industry insiders laugh at it, keep chipping away at market share with iteratively better versions, until they completely dominate the market with a product that is “good enough.” I doubt XBox will be any different.

maxuser
Contributor posted 03-27-2001 02:58 PM

As far as the original topic (D3D vs GL) goes, it’s a matter of personal preference and intended use. If you like pure, clean, “academic” APIs, OpenGL is for you. (It’s clearly not just academic, as John Carmack has proven.) If you don’t mind, or even enjoy, getting your hands dirty with often needless complexity, and you care only about supporting the latest 3D features on Windows, then D3D is worth a look. I personally like to experiment with somewhat academic 3D stuff on Mac OS X, so OpenGL is the obvious way to go. BTW, OpenGL support on Mac OS X is far superior to the second-class support that M$ provides (you can write an OS X OpenGL-based screensaver in about 50 lines of extremely simple code; try that on Windows), but that’s another topic…

Nutty
Frequent Contributor posted 03-28-2001 12:39 AM

Yeah… Mac OpenGL seems to be getting even more support. Didn’t I read somewhere that Apple have dropped their own API to push OpenGL more on the Macs?
If only M$ would do that too…

Don’t you think it would be better if MS dropped D3D and pushed for better OpenGL support under Windows? Then all major platforms and OSes would have a common top-of-the-range 3D API… it would rock… but no… instead they have to be stubborn gits, and force their ****e on us as always.

Nutty
Frequent Contributor posted 03-28-2001 02:05 AM

Oooops… having trouble accessing the forum boards… laggy… and unresponsive. Hence the double post… won’t let me delete it though…
Odd.

maxuser
Contributor posted 03-28-2001 06:00 PM

Apple realized that they didn’t have the market share or momentum to push their own 3D API (QuickDraw 3D), so they wisely adopted OpenGL for Mac OS 9, and inherited it from NeXTStep/OpenStep for Mac OS X. (BTW, there’s a Q3D-compatible API implemented with GL called Quesa, for those interested in yet another retained-mode layer over GL.) M$, on the other hand, does have the market share and momentum to successfully push their own API. Smaller platforms (like Mac and Linux) can only survive by adopting standards, whereas larger platforms (M$) will survive by driving a stake into the heart of any standard that would level the playing field with the small guys. They’ve done it with client-side Java, and I sure as hell hope they don’t do it with GL. I don’t like it, but that’s how business works.

andreiga
Contributor posted 03-30-2001 03:12 AM

M$ hates OpenGL because they can’t control it (there’s no company to buy in order to own OGL). Besides that, they hate it even more because the OGL specifications are not made by marketing guys (which would be very wrong).
P.S. I’m a marketing guy at a software company.

Hull
Frequent Contributor posted 03-30-2001 02:09 PM

And that’s why we love it.
I personally dislike DX because of its platform dependency and the M$ evil plans
to ‘take over the world’ behind it.

I used to think the Mac was a waste of money, but with their frenetic support of OpenGL and impressively good judgement, I have been starting to think about buying one and starting to develop on it.
(Money issue only here.)

I think JC has something to do with it too.

Nutty
Frequent Contributor posted 03-31-2001 02:57 PM

Been talking with some of my mates from work… and we still don’t seem to be able to see why the XBox might fail. Maybe you’re under the impression that it will cost loads, seeing as the gfx behind it is gonna be a superset of the GeForce 3. It won’t. Apparently it will take M$ 5 years to get into profit due to the loss they will make selling the machine at such a low cost.
If I saw a console out there with a better-than-GeForce-3-spec’d graphics system in it… for, say, 200 quid… BARGAIN! I’d jump at the chance to get one.

I still reckon the XBox will totally stomp all over the PS2… Probably the GameCube too… though that looks like a much nicer system than the PS2 as well… I really think that Sony have misjudged with the PS2…

just my tuppence worth.

Nutty


In case no one has noticed, this is an OpenGL forum. Quite obviously, people here will tend to think that OpenGL is better than D3D.

You know you can post links, too, right? I’m not quite sure what your point was in reposting an entire thread that is 2 1/2 years old, but I could probably post dozens more just like it. Not only that, but there’s a point in there where mcraighead (who was an nVidia driver writer) actually stated that OpenGL’s VAR was superior to DX7’s vertex buffers, and possibly DX8’s. That seems to counter your previous argument that D3D is faster…

The simple fact is, you can’t back up your claim that Direct3D is always faster than OpenGL. Why don’t you just admit that and be done with it? It’s like you are trying to debate the price of coffee by saying the sky is blue. You are challenged on one simple point of your posts, and you respond with something totally unrelated to what you were challenged on.

Oh well… I get the impression that you are young, and therefore “always right.” You will grow up some day.

First of all, I posted this to stop this nonsense, and also DX9 is out, so get up to date, will you!

Now you think that I’m a child? Well, maybe this is true, maybe it is not. One thing is for sure, though. See, I said when I posted that long thread that it might stop the nonsense. But you continue to fight, and I’m the child? If it will make you happy I will say that OpenGL is faster than D3D (even though I don’t really think this). Now I hope you will stop speaking this childish nonsense. So I will repeat: D3D is not as fast as OpenGL in some scenarios. Now you may stop.

I replied to this nonsense because it was on top. I have to give “who cares” credit for being an annoying troll; that is his only intention. Good move to post a complete thread instead of a link.

Anyway, there’s something sick about it.

You know what, just to prove I know some things, I will give my age: it is 0xFF, or maybe 0x0A. OK, that’s enough of this thread.

Don’t worry about that.
The only thing you should think about is why you do it.

It’s OK if you do not understand that. Now, this is the last time I will reply to anything in this topic, as long as no one reconnoiters for dirt.

Thanks!
It was a serious suggestion; you have something to think about, or perhaps even worry about.

Is this a threat? Are you threatening my comp? Are you respecting my authoritahhhah!

Please, you don’t have to tell anyone, but ask yourself why you do it. You are threatened, but not by anyone but yourself.

Thank you, some guy, or is it Latrans? Who is 0xFF? He is old.

No, I am just some guy.
Glad that you enjoy it, but this is my last reply.

Wait, you dumb brat, you piece of crap, that was not my last reply! You damn troll! I hate all of you!

I don’t know what you mean by 0xFF, OK, and I do not want to know. I am a C/C++ guru.

He he, I guess that if I’m to continue posting here, I’ll have to register.

OK, I do not know what 0xFF or 10ff or whatever is; all I know is that I am a C/C++ guru and I do not need to know what 0xFF is or what 0x0A is, and that’s that!

OK, now I’m a reg. and I am a C/C++ guru and I don’t know daaaaaaaaa what hex is, hahaha!

Originally posted by Whocares:
First of all, I posted this to stop this nonsense, and also DX9 is out, so get up to date, will you!

Did you even read the “nonsense” you posted? Let me point out one little part…

mcraighead
Frequent Contributor posted 03-18-2001 10:25 PM

I don’t know about DX8, but on DX7, VAR was definitely superior. I think DX8 fixes some but not all of the DX7 vertex buffer problems.

  • Matt

Note, he mentions DX8 and DX7. This was 2 1/2 years ago, if you bothered to even look at the date of your own drivel.

Believe that D3D is faster if you like. It still doesn’t change the fact that it isn’t in all cases. How ironic that in your attempt to be as annoying as possible, you may have posted a statement by a respected member of nVidia that proves you wrong. I find that laughable.

Your responses are all pretty immature, which is what makes me think you are young. Instead of trying to counter my arguments against your claims with facts that can actually be backed up, you respond with drivel and a defensive attitude.

[This message has been edited by Latrans (edited 08-29-2003).]