PIXELFORMATDESCRIPTOR Blues

OK … this sucks …

I’ve got my own OpenGL framework. I just put a frame counter into it. Guess what … at 800x600x32x24x8 … I get a monumentally disappointing 260 FPS!

The code goes something like this …

check message loop
glClear(depth and colour)
swapbuffers

and that’s it! I tested the nehe base code at 1024x768x32x24x8, which knocks out around 600 FPS when not drawing. So something is very wrong here. But to make things worse, when I went to 800x600x16 in my own code, the FPS went DOWN to 160 FPS.

I really can’t work out what’s going on here. There is absolutely the bare minimum of windowing code going on. Yet I definitely have a hardware-accelerated PixelFormat.
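For what it’s worth, the check is roughly this (just a sketch; Dump is my own printf-style logging helper):

    // sketch: make sure the pixel format isn't the GDI generic (software) one
    // assumes hDC is valid and the GL context is already current
    PIXELFORMATDESCRIPTOR pfd;
    int format = GetPixelFormat(hDC);
    DescribePixelFormat(hDC, format, sizeof(PIXELFORMATDESCRIPTOR), &pfd);
    if ((pfd.dwFlags & PFD_GENERIC_FORMAT) && !(pfd.dwFlags & PFD_GENERIC_ACCELERATED))
        Dump("Software (GDI generic) pixel format!");
    Dump("Vendor: %s  Renderer: %s",
        (char*)glGetString(GL_VENDOR), (char*)glGetString(GL_RENDERER));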

Any ideas would be much appreciated.

BTW, I have just installed the Nov 2001 Windows SDK. But this shouldn’t make any difference.

hm, this is really strange…
perhaps you could dissect nehe’s code: leave some parts out, and see when it loses speed. oh, but you compiled nehe’s code on your machine, right?

l8ers,
Tolga.

Yeah - I just ripped the drawing code out and recompiled the nehe stuff.

But like I said - my code is now so simple it isn’t true. I’ve commented out all the DirectInput stuff, so only the bare minimum is left. It really is simple stuff.

RegisterClass
ChangeDisplaySettings
CreateWindow
SetPixelFormat
CreateGLContext etc

Everything works fine … apart from the speed. There are no errors etc. The pfd shows a correct format. The renderer is NVIDIA, not MS.
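For reference, wndSetPixelFormat is just the usual ChoosePixelFormat/SetPixelFormat wrapper - roughly along these lines (a simplified sketch, not the exact code):

    // sketch of the usual ChoosePixelFormat path for 800x600x32x24x8
    PIXELFORMATDESCRIPTOR pfd;
    memset(&pfd, 0, sizeof(pfd));
    pfd.nSize        = sizeof(pfd);
    pfd.nVersion     = 1;
    pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType   = PFD_TYPE_RGBA;
    pfd.cColorBits   = 32;
    pfd.cDepthBits   = 24;
    pfd.cStencilBits = 8;
    int format = ChoosePixelFormat(hDC, &pfd);
    if (format == 0) return false;
    if (!SetPixelFormat(hDC, format, &pfd)) return false;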

I’m stuck!

yeah, but you shouldn’t strip your program down, you should strip nehe’s, step by step. you know, enabling some stuff can increase speed too (glHint, culling, …). try it. i know, these problems are horrible, and in the end you realize it was something really stupid (at least, that’s been my experience :wink:
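for example, something like this in the init (just a sketch of the usual state setup, nothing special):

    // example state setup for the init - just the usual suspects
    glShadeModel(GL_SMOOTH);
    glClearDepth(1.0);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);
    glEnable(GL_CULL_FACE);                              // don't rasterize back faces
    glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);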

so long,
Tolga.

Here’s the main loop.

  	DWORD t = GetTickCount();
  	long c = 0;
  	while (MessageHandler() == true)
  	{
  		glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
  		c += 1;
  		SwapBuffers(hDC);
  	}
  	Dump("%i %i", c, GetTickCount()-t);

MessageHandler returns false when a WM_QUIT message is received
c is just a simple frame counter
t is the number of milliseconds elapsed
The Dump function just writes the two values to a text file.
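GetTickCount is only good to about 10-15 ms, so if that ever turns out to be too coarse, the same loop timed with QueryPerformanceCounter would look roughly like this (just a sketch):

    // same loop, timed with QueryPerformanceCounter instead of GetTickCount
    LARGE_INTEGER freq, start, stop;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&start);
    long c = 0;
    while (MessageHandler() == true)
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        c += 1;
        SwapBuffers(hDC);
    }
    QueryPerformanceCounter(&stop);
    double seconds = (double)(stop.QuadPart - start.QuadPart) / (double)freq.QuadPart;
    Dump("%i frames in %f seconds = %f FPS", c, seconds, c / seconds);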


looks quite ok, no? how about your initialization? is it exactly the same as in nehe’s?

Here’s the main window creation function …

bool wndCreateOpenGLWindow(long Width, long Height, char ColorBits, char DepthBits, char StencilBits, unsigned long DisplayFrequency)
{
    // switch to the requested fullscreen display mode
    if (!wndChangeDisplayMode(Width, Height, ColorBits, DisplayFrequency)) return false;

    hWnd = wndCreateWindow(Width, Height);
    if (hWnd == NULL) return false;

    hDC = GetDC(hWnd);
    if (hDC == NULL) return false;

    if (!wndSetPixelFormat(Width, Height, ColorBits, DepthBits, StencilBits)) return false;

    if (!wndCreateOpenGLContext()) return false;

    ShowWindow(hWnd, SW_SHOW);
    SetForegroundWindow(hWnd);
    SetFocus(hWnd);

    // projection setup: 45 degree FOV, near 10, far 10000
    dspOpenGLAspect(45, Width, Height, 10, 10000);

    glClearColor(0.2f, 0.2f, 0.3f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    SwapBuffers(hDC);

    return true;
}

couldn’t be simpler!

how about state initialization? depth test, shading etc???

oh, and you should get rid of that deprecated gl aux library. at least on my system, i don’t have it. does anyone know whether it costs performance?
rather use glut or your system-specific windowing routines…
and use gluPerspective() for example instead of dspOpenGLAspect(). and why is your far plane at 10000? use 1 for near and 100 for far, it should be sufficient. and then, why do you do a swap during initialization?
how about clear-depth?
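e.g. something like this (just a sketch; gluPerspective needs <GL/glu.h> and glu32.lib):

    // projection setup with gluPerspective, near 1, far 100
    glViewport(0, 0, Width, Height);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45.0, (GLdouble)Width / (GLdouble)Height, 1.0, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glClearDepth(1.0);   // the value the depth buffer is cleared to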

bye…

Why get rid of glaux? It does nothing!

That’s my problem - the code above is all there is … why is it so slow?


I do a swap during the init because I was originally loading textures - it just made things a bit more interesting!

Problem Solved!

dwStyle = WS_POPUPWINDOW;

should be …

dwStyle = WS_POPUP;

Duh
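For anyone hitting the same thing: WS_POPUPWINDOW expands to WS_POPUP | WS_BORDER | WS_SYSMENU, so the “fullscreen” window was getting a border, and presumably the driver fell back to blitting instead of page flipping on SwapBuffers. The window creation should look something like this (a sketch - the class name, hInstance etc. are placeholders for whatever wndCreateWindow actually uses):

    // fullscreen-friendly window creation
    DWORD dwExStyle = WS_EX_APPWINDOW;
    DWORD dwStyle   = WS_POPUP;   // NOT WS_POPUPWINDOW (= WS_POPUP | WS_BORDER | WS_SYSMENU)
    hWnd = CreateWindowEx(dwExStyle, "GLClass", "GLWindow",
                          dwStyle | WS_CLIPSIBLINGS | WS_CLIPCHILDREN,
                          0, 0, Width, Height,
                          NULL, NULL, hInstance, NULL);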