Recently, one of our developers noticed a substantial improvement in frame rate occurring at the highest screen resolution our systems support.
After some more experimentation, it turned out to be erratic … sometimes there was no speed improvement at all. (We opted for a frame-rate-controlled application, and it's limited to a maximum of 30 frames/sec.)
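For reference, the cap is just a timer-based limiter, roughly like this (a simplified sketch with made-up names, not our actual code):

#include <windows.h>

void RenderFrame();  // stand-in for our real scene draw

void FrameLimitedLoop()
{
    LARGE_INTEGER freq, frameStart, now;
    QueryPerformanceFrequency(&freq);

    const double framePeriod = 1.0 / 30.0;   // cap at 30 frames/sec

    for (;;)                                  // main loop, simplified
    {
        QueryPerformanceCounter(&frameStart);
        RenderFrame();

        // wait out the rest of the ~33.3 ms frame period
        do {
            QueryPerformanceCounter(&now);
        } while ((double)(now.QuadPart - frameStart.QuadPart) / (double)freq.QuadPart < framePeriod);
    }
}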
From our tests, we got the following results:
GeForce2 GTS
NVidia 43.45
Windows 98
Pentium 3 550MHz
Res.        Draw time   Frame rate
640x480     10 ms       30 fps
800x600     10 ms       30 fps
1024x768    39 ms       25 fps
1280x1024   46 ms       21 fps
1600x1200   70 ms       14 fps
but …
1600x1200   10 ms       30 fps
The draw time and frame rate figures come directly from our software's own timing code.
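Roughly, the draw time is the interval measured around our draw call, and the frame rate comes from the full frame-to-frame period, along these lines (illustrative sketch only; DrawScene and SwapAndWait are made-up stand-ins for our real routines):

#include <windows.h>
#include <stdio.h>

void DrawScene();    // stand-in for our real rendering code
void SwapAndWait();  // buffer swap plus the 30 fps limiter shown above

void MeasureFrame()
{
    LARGE_INTEGER freq, t0, t1, t2;
    QueryPerformanceFrequency(&freq);

    QueryPerformanceCounter(&t0);
    DrawScene();                 // the "draw time" interval
    QueryPerformanceCounter(&t1);
    SwapAndWait();               // rest of the frame
    QueryPerformanceCounter(&t2);

    double drawMs = 1000.0 * (double)(t1.QuadPart - t0.QuadPart) / (double)freq.QuadPart;
    double frameS =          (double)(t2.QuadPart - t0.QuadPart) / (double)freq.QuadPart;

    printf("draw %.0f ms, %.0f frames/sec\n", drawMs, 1.0 / frameS);
}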
The tests were performed with 4x antialiasing and 2x anisotropic filtering. (Other settings were tried, which improved the performance of the mid-range resolutions, but the sudden improvement at the high resolution persisted.)
As you can see, the performance drop-off is about what you'd expect … until 1600x1200!
At the moment, we're completely at a loss to explain this effect. We're obviously keen to get this improvement 100% of the time, and ideally to improve the speed of the lower resolutions as well.
Has anyone got any ideas?
Thanks
Andrew