I’m running a few small OpenGL demos found on NeHe’s site. When I run the .exes, I find them running very slowly. The geometry in the examples isn’t overwhelming, either. I have an ATI Rage Pro, so I assumed the demos would be hardware accelerated. Is there a step needed to get the video card to recognize the OpenGL executable and accelerate it? Any pointers on this would be helpful. Thanks!
There is nothing you have to do except install proper drivers for your card, which I assume you have, since some applications do get hardware accelerated. If your driver supports something in hardware, it WILL be accelerated; otherwise it will fall back on software rendering, which I think is what is happening when some applications run slower. Maybe the slower applications are using some feature your card doesn’t support in hardware, and therefore fall back to software.
This is wrong; GLUT does not accelerate displays, because the windows it creates are movable (this means that you can put the window in a position where it bridges 2 monitors and hence needs 2 graphics cards to render the separate halves, which doesn’t work). To get an accelerated display use SetupGL from Apple; it’s complicated but gives you huge speed gains (my game went from 20 Hz to 180 Hz, which is faster than the refresh rate of my monitor).
GLUT creates a window for rendering, and if your drivers support hardware acceleration, these windows will be accelerated, period.
I dunno about dual monitors, but on single monitors you will get hardware acceleration if your driver allows it.
Are you using OpenGL ICD drivers?
If not you won’t get any acceleration.
Of course, what else should we use?
Could be that the ATI Rage Pro may only accelerate at 16bpp…
In response to the earlier comment about GLUT being accelerated.
GLUT USES a window; it does NOT need to use one. Windows cannot be accelerated for the reason I pointed out earlier about 2 monitors. There are only 2 things that can be accelerated: modal windows (windows that are not movable) and full-screen displays, i.e. also unmovable (these are also modal windows, they just take up an entire screen).
GLUT is NOT accelerated, as can be shown if you write a program that uses GLUT and then write your own code to initialise full-screen mode and use graphics acceleration. I have tried this: GLUT only managed 20 fps, whereas my own initialisation provided 150 fps.
In reply to the comments about ATI Rage Pros only accelerating in 16-bit mode:
They may be poor cards, but they aren’t that bad.
You can’t compare true Win32 fullscreen with GLUT’s fake fullscreen using glutFullScreen(). glutFullScreen is just a maximized window on the desktop, and has the same resolution and color depth as the desktop does. ALL, read, ALL, applications I have ever written in GLUT have been accelerated. When using glutFullScreen I have had some problems with hardware buffer support, but that is due to the combination of my card and my desktop resolution. I have a 16 MB TNT card, my desktop is 1280x1024x32, and my card can’t keep two 32-bit color buffers and a depth buffer at that resolution. So it’s my card, and not GLUT, that forces me into semi-software mode.
But when creating a window with more realistic size (500x500 or similar) I have no problems at all getting hardware acceleration.
I think this is the case on your computer too. Try creating a 500x500 window (not fullscreen) in GLUT, and then with your own code, and see the result. Unless you have an old version of GLUT and/or really bad drivers, you won’t see any difference.
And by the way, all the windows I’m talking about are as movable as any other window.
Well that’s odd, because NONE, repeat NONE, of my GLUT applications are accelerated. I do have decent drivers (I just got the most recent ones straight off the ATI site).
I also did another test just to make sure. I created some OpenGL code which used display lists (i.e. if it is accelerated, these should be stored on the graphics card and only calculated once). Just to make the difference show some more, I put quite a lot of slow calculations in these display lists (lots of sines and cosines). I also gave the GLUT version an advantage: I made it use a window of size 256 x 256, while my own version, which initialised the graphics card properly, drew fullscreen at 800 x 600.
If what you say about GLUT accelerating windows is true then the GLUT one should go a bit faster than the one I wrote.
My own version: 150 fps.
GLUT version: 3 fps.
WOW, even I was surprised at that much difference, but then it does prove my point. GLUT DOES NOT ACCELERATE WINDOWS.
:-(> < ) … (I’m sure this should be a vampire).
I know some (all?) Voodoo cards can only accelerate 3D in fullscreen mode. Perhaps your ATI card is similar. Or perhaps your GLUT lib is improperly choosing a pixel format for your ATI card and is erroneously choosing one that forces GL into software mode. Just like Bob says, on my TNT, GLUT is accelerated, even in a window that is movable.
To make it all clear here, GLUT doesn’t accelerate anything. I know I have said that GLUT does accelerate, but this is wrong. You know why? Because there is nothing to accelerate.
GLUT ONLY creates a window for rendering, that’s all (apart from the parts for handling input and stuff, but that’s not about the rendering). ALL rendering is performed by the manufacturer-specific driver (or the default software driver if no special driver is present). So you can’t say GLUT does/doesn’t accelerate OpenGL, because GLUT doesn’t have anything to do with OpenGL, except to deliver a proper environment for rendering. In other words, GLUT’s work ends before OpenGL’s work starts.
If this “proper environment” happens to not suit your ATI card, it is not GLUT’s fault, nor is it Windows’ fault. It’s the driver’s fault for not being able to handle this specific environment.
If you look into your code, you won’t see any call at all to GLUT that is actually drawing anything on its own. All rendering code uses gl* or glu* functions, which are NOT a part of GLUT, but a part of core OpenGL, which resides within the driver. Even those GLUT functions for drawing primitives (glutSolidSphere for example) use glu functions with quadrics (i.e. still not drawing them on their own, but calling the core functions).
And I really don’t like the test you did. By saying you gave GLUT an advantage by giving it a lower resolution, you actually gave it a DISadvantage by giving it a windowed window (heh, funny combination of words) compared to the fullscreen in your own code. In windowed mode, you have to do window clipping and take care of the rest of the desktop, which you don’t have to in true fullscreen mode (here is the disadvantage). Another disadvantage with GLUT is that you have no control at all when selecting a pixel format. With your own code, you can set a pixel format that you KNOW suits your card/driver. GLUT doesn’t know about this, and may give you an environment your card/driver doesn’t like.
And if you look into the GLUT source code, you will (amazingly) find native Win32 functions for setting up a window. The same functions at least I, and surely most other people, use when setting up a window on their own.
And to make a counterstrike, I also made a small application, which I tested on my dad’s computer and here at university (ATI cards in both test computers). Windowed mode using both GLUT and native Win32 functions was accelerated (I ensured this by querying the vendor and renderer from OpenGL, and it was WAY too fast for software), with no noticeable difference in frame rate. Fullscreen using Win32 was accelerated, but not glutFullScreen, because the desktop was too large and the card ran out of memory (card’s fault, not GLUT’s).
You said that GLUT cannot decide whether to accelerate a window all by itself; if it was well coded, it could.
You appear to have a misconception that graphics hardware renders everything and anything. This is wrong: graphics hardware is only ever used if and when the drivers and the card are initialised. Here is some proof of that: load up just about any game that uses 3D graphics and look in the graphics options. There will be an option somewhere where you decide what renderer to use; it will probably only have “Software rendering” and “**** Graphics card” in it. At this point, ask yourself the question: how does this game even know there is a difference between the two, if what you say is true? The game knows the difference because it scans for all available renderers and then initialises one depending on what you choose in that menu. If no renderer is initialised, the OS will take over and use software rendering.
As to what you said about my test, I realised straight after I posted the message that what I had done was flawed, because I had changed 2 variables at once. So I slightly rewrote the GLUT code so that it too used fullscreen at the same resolution; there was no apparent difference. It would appear that the benefit of having to render fewer pixels and the disadvantage of having to draw the background processes etc. cancelled each other out almost exactly.
The results now truly show that GLUT does not accelerate windows.
Bob, I think you have a troll on your hands. No offense tdavie (if you are not a troll that is).
I generally don’t play that many games nowadays. In the games I have played with OpenGL support, I haven’t seen any choice about software rendering in OpenGL (thank god for that one). Only application-specific rendering (you know, the good old days in DOS when you wrote your own rendering code?), which doesn’t have anything to do with OpenGL.
And I’m gonna say it again: GLUT doesn’t accelerate anything. It creates an environment for your driver, which in turn accelerates the functions it can. And (your drivers) != GLUT. GLUT is mainly for demonstration purposes, for demonstrating effects and techniques. That’s about it. So I’m surprised you are using it at all.
But I had a look at the first message, and gte727h didn’t mention anything about GLUT. You brought this up, tossing in a comment about GLUT all of a sudden, which was a bit off topic.
And by the way, your test only shows your programs aren’t accelerated on YOUR computer; not mine, nor my friends’, nor anyone’s computer, except for your own. Same with my test, but my test shows GLUT CAN be accelerated when creating windows, so GLUT CAN create accelerated environments, and your argument is not valid any more.
(my smiley-army is better than yours)
p.s. Anyone else here reading this extremely friendly conversation who might want to share their experience about GLUT being able to create accelerated/nonaccelerated windows?
[This message has been edited by Bob (edited 09-19-2000).]
- GLUT doesn’t accelerate anything: I’m glad you finally agree with me. If you look at my first message, I say that the reason for the tutorials being slow is that GLUT does not accelerate anything.
- I brought up the topic of GLUT: That’s because I learned OpenGL using the NeHe tutorials and I know that they use GLUT.
- You haven’t seen any options to choose your renderer: I state a famous quote, “The absence of evidence is not evidence of absence”. I have seen these options in almost every game I have (the only exception being The Sims).
- My test only shows that GLUT doesn’t accelerate anything on my computer: True, but I have seen the NeHe tutorials on other computers running extremely slowly when all they are generally drawing is a cube (most of the earlier ones).
- I notice that one of your smileys is already scared after seeing just the front rank of my army (the one going ). Here is just one legion (it may take a while to scroll, but persevere):
- I am NOT a troll.
tdavie, you seem to be missing Bob’s point.
GLUT is just a layer of code on top of OpenGL. GLUT just uses OpenGL, and your OpenGL driver should accelerate the graphics, within any screen size constraints, or other pixelformat constraints of your video card and video card driver. If your GLUT based code is not accelerated, then it is because one of those constraints is not being met by your OpenGL driver. GLUT is just a template, its only purpose is to make writing OpenGL programs easier.
Regarding point 3, I don’t follow. What is the point of that point? If you want to have multiple renderers available, then you have to supply them. OpenGL is not going to provide a D3D HAL renderer, or any other renderer. That is your job.
I’ll refrain from introducing another smiley to the page. ;->
[This message has been edited by DFrey (edited 09-20-2000).]
Sorry tdavie, but Bob is right on this. I have written MANY applications, some with GLUT, some with Win32, some with MFC, some fullscreen, some windowed… All have taken advantage of hardware acceleration.
Have you tried messing with your desktop settings before running your windowed apps? It is possible that your current resolution/bit depth will not produce an accelerated window.
But regardless of why your GLUT apps are not getting accelerated, the problem is NOT GLUT in a window, it is something else.
Ah, but I just thought of something else. In your first post you comment about multi-mon issues with acceleration. Are you running a dual monitor setup? If so, then this could very well be your problem. Windows implementations of multi-mon are known to have many issues with windowed hardware acceleration. Windows 2000 is a little better than 98; there I’ve actually been able to get hardware acceleration with 2 monitors, but only on the primary monitor, not the secondary (and really unpleasant things happen if you try to move the window to the secondary monitor). If you are running multi-mon, try disabling the secondary monitor.
If not, then disregard that last bit.
And one more thing… You talk about using “SetupGL from apple.” Are you running on a Mac by any chance? I personally only have experience with GLUT on PCs and SGIs.
[This message has been edited by Rob (edited 09-20-2000).]
I see the point about GLUT just being another layer above OpenGL which provides utilities for creating windows, handling events etc. OpenGL is also a similar layer, which adds to the OS by allowing you to draw pretty graphics. This does not matter.
My point is that a graphics card MUST be initialised before it is used; if you do not initialise a graphics card, all that happens is you get software rendering.
I have never known GLUT to do the right initialisations to get the graphics card working properly. Also, as far as I know, an ATI Rage Pro, for all its failures, can handle a wide variety of pixel formats.
As regards point 3, you seem to have slightly misunderstood me. By ‘renderer’ I mean the gubbins that processes the OpenGL code (and not whether it is processing OpenGL, Direct3D, Glide or QD3D), e.g. on my computer I usually get 2 options: ‘Software rendering’ or ‘ATI RAGE Pro’. The reason for me making reference to this is that if the graphics card were automatically chosen by the OS (i.e. the graphics hardware were always used and never needed initialising), then it would not be possible to have this menu; hence it must be possible to decide which graphics renderer to use. By default, software rendering is used and NOT the graphics card.
p.s. I would like to use just one more smiley and then put an end to ‘SMILEY WARS’