Configuration change killed my FPS

Hi,

With the summer heat I had some hardware failures, and had to replace my motherboard, RAM and hard drive, keeping only my original graphics card, CPU and sound card.

I was working on a GL game engine when the problem occurred. It was running at ~54 FPS (avg) with 500 instances and, since the full system reinstall, I can't get better than 3 FPS whether I render 300 instances or 5!

I thought the GL setup at the start of the code was enough to define the parameters regardless of which card/system the program runs on, but it seems I was naively optimistic and something is eating my whole framerate.

Has anyone experienced such a problem, or have a clue about a possible cause?

Thanx.

Edit:
my system has v5.0.2195.6611 of OpenGL32.dll and I only use native GL calls (no GLU/GLUT stuff).

Download a recent driver for your graphics card.

>> v5.0.2195.6611 of OpenGL32.dll
This is not meaningful, as this DLL is from Microsoft and hasn't changed for 10 years. The actual DLL used for your card depends on the vendor (ATI, NVIDIA, etc.).

What does GLinfo report?
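If you don't have a GLinfo-style tool at hand, you can query the same information from code once a context is current. A minimal sketch (the function name is just for illustration; assumes a valid GL rendering context):

    #include <windows.h>
    #include <GL/gl.h>
    #include <stdio.h>

    /* Call with a GL rendering context already current. If GL_VENDOR
       says "Microsoft" and GL_RENDERER says "GDI Generic", you are on
       the unaccelerated software renderer, which alone would explain
       a drop from ~50 FPS to single digits. */
    void PrintGLInfo(void)
    {
        printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    }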

Drv v6.14.10.6693
GL v1.5.2
SLV v1.00 NVidia

Hmmm, it's all Greek to me…
The graphics card drivers are the latest I could find, but for GL I'm not sure that's the latest version, nor whether a newer one would fix the problem. A driver version alone causing such a performance difference would be surprising.

If you changed your motherboard but use your old hard drive, did you install the correct motherboard chipset drivers?
If your graphics board is AGP, check your Device Manager's System Devices for an entry with "AGP" in it. Unknown devices listed there can mean missing chipset drivers.
NVIDIA graphics driver 66.93 is out of date. Check the display control panel after you have installed newer ones, found here: http://www.nvidia.com/content/drivers/drivers.asp
OpenGL drivers are part of the graphics driver package. You can’t update them independently.

I changed motherboard and hard drive (RAM too), so I reinstalled everything.
Even with the latest NVIDIA drivers from your link, I still get this low framerate.

In fact, I can't find the settings I was used to for the card (like antialiasing, Z-depth, etc.).

I don't see how something could work for two years and break now, with the same card.
I tested this code on two machines (mine and a friend's): I used to get 44 fps on his, 53 on mine. Testing the very same executable tonight, I get 44 on his and 7 on mine.

Since the code didn't change, I guess the CreateGLWindow routine is not to blame, but figuring out what differs between the default card settings and the working ones, on my previous configuration and on his, would tell me what to force in software if needed.

Besides, the system has recognized the card and drivers, since SWAT 4 and Silent Hunter 3 use them correctly. That makes me think it's a pure software/settings problem that a correct init in code could fix.

The GL init code is this:

	 static PIXELFORMATDESCRIPTOR s_oPfd = {
		sizeof( PIXELFORMATDESCRIPTOR ),
		1,							// version number
		PFD_DRAW_TO_WINDOW |		// format must support window
		 PFD_SUPPORT_OPENGL |		// format must support opengl
		  PFD_DOUBLEBUFFER,			// format must support double buffering
		PFD_TYPE_RGBA,				// request RGBA format
		32,							// color depth bits
		0,0,0,0,0,0,				// color bits ignored
		16,							// alpha buffer (0/8/16/32)
		0,							// shift bit ignored
		16,							// accumulation buffer bits		(0)
		5,5,5,1,					// accumulation bits ignored	(0,0,0,0)
		32,							// zbuffer depth bits (16/32)
		0,							// no stencil buffer
		0,							// no auxiliary buffer
		PFD_MAIN_PLANE,				// main drawing layer
		0,							// reserved
		0,0,0 };					// layer masks ignored

…but it worked on my previous configuration…

…which is why I asked whether someone had faced and solved such a situation.
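In the meantime, I'll check what the driver actually grants versus what my PFD requests. A minimal sketch of reading back the chosen format with DescribePixelFormat (the function name is illustrative; hDC is the window's device context, iFormat is what ChoosePixelFormat returned):

    #include <windows.h>
    #include <stdio.h>

    /* Read back the pixel format the driver actually selected;
       requested and granted values can differ (an exotic request
       like 16-bit alpha may silently fall back to something slow). */
    void DumpChosenFormat(HDC hDC, int iFormat)
    {
        PIXELFORMATDESCRIPTOR pfd;
        DescribePixelFormat(hDC, iFormat, sizeof(pfd), &pfd);
        printf("color %d  alpha %d  depth %d  stencil %d  %s\n",
               pfd.cColorBits, pfd.cAlphaBits, pfd.cDepthBits,
               pfd.cStencilBits,
               ((pfd.dwFlags & PFD_GENERIC_FORMAT) &&
                !(pfd.dwFlags & PFD_GENERIC_ACCELERATED))
                   ? "(software!)" : "(accelerated)");
    }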

Found the culprit:

My desktop was configured for 16-bit color while my GL settings in the app were 32-bit.

I understand why that caused problems in windowed mode, but I can't figure out why it also occurred in full-screen!?
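(For reference, one way to make the fullscreen switch independent of the desktop depth is to request the bit depth explicitly in the DEVMODE. A sketch, with a placeholder function name and width/height; ChangeDisplaySettings only changes fields named in dmFields, so without DM_BITSPERPEL the desktop's 16-bit depth is kept:)

    #include <windows.h>

    /* Sketch: request the color depth explicitly when going
       fullscreen, so the mode switch cannot inherit a 16-bit
       desktop and force a slow pixel format on a 32-bit app. */
    BOOL GoFullscreen32(int width, int height)
    {
        DEVMODE dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize       = sizeof(dm);
        dm.dmPelsWidth  = width;
        dm.dmPelsHeight = height;
        dm.dmBitsPerPel = 32;   /* match the 32-bit PFD request */
        dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;
        return ChangeDisplaySettings(&dm, CDS_FULLSCREEN)
               == DISP_CHANGE_SUCCESSFUL;
    }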

16, // alpha buffer (0/8/16/32)

32, // zbuffer depth bits (16/32)

As far as I know, alpha can only be 8 bits (on consumer cards), and the Z-buffer can only be 16-bit, or 24-bit plus 8 bits of stencil.
That might explain why it did not work in fullscreen either.
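Something like this would be a safer request; a sketch using the values above (not your exact code, adapt the flags to your needs):

	static PIXELFORMATDESCRIPTOR s_oPfd = {
		sizeof( PIXELFORMATDESCRIPTOR ),
		1,							// version number
		PFD_DRAW_TO_WINDOW |		// format must support window
		 PFD_SUPPORT_OPENGL |		// format must support opengl
		  PFD_DOUBLEBUFFER,			// format must support double buffering
		PFD_TYPE_RGBA,				// request RGBA format
		32,							// color depth bits
		0,0,0,0,0,0,				// color bits ignored
		8,							// 8-bit alpha buffer
		0,							// shift bit ignored
		0,							// no accumulation buffer
		0,0,0,0,					// accumulation bits ignored
		24,							// 24-bit z-buffer
		8,							// 8-bit stencil buffer
		0,							// no auxiliary buffer
		PFD_MAIN_PLANE,				// main drawing layer
		0,							// reserved
		0,0,0 };					// layer masks ignored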

For alpha, 8 bits are enough, but for the Z-buffer, 24 bits seems quite light. It could explain some of the face-flickering (Z-fighting) effects I get at relatively short distances.

Maybe GL_LESS would be better than GL_LEQUAL in glDepthFunc, but that's another problem.
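For reference, a sketch of the depth setup I mean (illustrative function name):

    #include <GL/gl.h>

    /* Depth-test setup sketch. With GL_LESS, a fragment at exactly
       the same depth as the stored value is rejected, so for truly
       coplanar faces the first one drawn wins consistently, while
       GL_LEQUAL lets the last one drawn overwrite. Flicker caused
       by depth precision is a different matter: most precision sits
       near the near plane, so pushing the near plane out usually
       helps more than changing the depth function. */
    void SetupDepthTest(void)
    {
        glEnable(GL_DEPTH_TEST);
        glDepthFunc(GL_LESS);
        glClearDepth(1.0);
    }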

Here it seems the color depth mismatch between the desktop setting and the application setting was to blame. Back to 50 fps from 7 fps, quite a gain!
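To catch this automatically next time, the app could compare the desktop depth with what it is about to request at startup; a minimal sketch (illustrative function name):

    #include <windows.h>
    #include <stdio.h>

    /* Sketch: warn at startup if the desktop color depth does not
       match what the PFD will request, so a 16-bit desktop can't
       silently push a 32-bit app onto a slow path again. */
    void CheckDesktopDepth(int requestedBits)
    {
        HDC hScreen = GetDC(NULL);                  /* screen DC */
        int desktopBits = GetDeviceCaps(hScreen, BITSPIXEL);
        ReleaseDC(NULL, hScreen);
        if (desktopBits != requestedBits)
            printf("Warning: desktop is %d-bit but app requests %d-bit\n",
                   desktopBits, requestedBits);
    }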
