Want to try my demo?

I’ve been working on a really cool model library that reads 3ds files. When I’m done, I want to give it away free to people who aren’t going to make a profit on the game it’s used in.

When I’m near completion, I’ll list all of the features.

There are 890 polygons in the model.

The top left number is the FPS. The number underneath is the time (left over from a previous demo).

Press the ‘~’ for the console. The console lists a few commands you can try out.

I’d like to know the Video Card and Framerate for the ‘uv’ and ‘none’ modes.

Thanks.

Rm2 Demo 200KB


[This message has been edited by WhatEver (edited 03-12-2001).]


I have PIII 800 with 32MB Geforce 256 running Win 2K.

Here are the frame rates I got. V sync was disabled.

UV 650
Color 634
Both 606
None 670

Hope that helps.


I have a p3 600, 128MB, and a TNT2 m64 32MB, running win98.

UV 410
color 370
both 330
none 440

Oh, and on my computer it was the “'” key (also the “@” key) and not the “~” key that enabled the console. Could this be a difference between US and UK keyboards?

The program ran fine, but I couldn’t get the console to pop up at all. Tried all/most keys, with and without Shift/AltGr, but no console. Got a Swedish keyboard here.

Anyways, dual P3 733 and Win2k - no problems.

Woooo hoooo. Finally, a TNT2 that runs with a more than fine framerate. All of my previous attempts at drawing models on a TNT2 had low framerates. All I did this time was align all the elements so there weren’t so many vertices being transformed. Makes sense, but the GeForce doesn’t seem to care.

Bob, my console uses the same key id uses for their console. I’m not familiar with UK keyboards so I don’t know what to say. Try using the keys Tim Stirling mentioned.

Thanks for testing my demo guys. Don’t stop testing though 'cause I’d like to see the performance with a few other video cards.

NOTE: if the program doesn’t start up, it means your video card doesn’t support the glLock and glUnlock extensions.

Originally posted by Nutty:
I have PIII 800 with 32MB Geforce 256 running Win 2K.

Thought you had a brand new 1GHz Athlon without a gfx card, and hence were running in VGA mode?

OK, I am running a Dual P3 600Mhz with an ELSA Erazor X2 (GeForce 256 DDR) under Win2K + Detonator 10.80.

UV 566
color 527
both 479
none 632

By the way, I had to use the “@” key as well!



>>Bob, my console uses the same key id uses for there console

Yes, you do, but how do you detect that key? I can get the console to work in all of id’s games, but that is not because I press the ‘~’ key. To get the console in Quake I press ‘§’, because that’s the character mapped to that key (the key left of ‘1’ and below Escape). The same physical key, but different characters mapped to it.

Maybe you’re detecting keys using the actual character mapped to the key, but that’s not so good. To get the same physical key on all keyboards, you should use the keyboard scan code: the code generated by the keyboard itself, which the OS in turn maps to characters. Scan codes are identical on all (or most) keyboards.

That is what I do. Here’s my code matching on the scan code, not the ASCII code:

case 192:

People from the UK have tested my demo before and they’ve never said they were having any problems :/. I wonder what’s wrong?


Mine’s an Athlon 800MHz with a GeForce2 GTS.

uv 1226
color 942
both 800
none 1405

I also had a problem with the console key (I’m using a UK keyboard); I had to press the ' key (Shift-' => @) for the console.


Yes, Eric, I do have an Athlon… I ran it on my work machine.

still waiting for them geforce 3’s…


MSVC 6 documentation says that when a WM_KEYDOWN message is posted, wParam contains the virtual key code of the key pressed. The virtual key code is, as far as I know, the value after the device driver has decoded the scan code using the selected keymap. So the tilde can be different physical keys on different keymaps.

lParam bit 16 to 23 contains the scan code, i.e. the physical key on the keyboard.

And the application might work on UK keyboards, I’m not gonna argue about that. But as I said, I’m on a Swedish keyboard.

Anyways, to get a tilde I have to press AltGr + the key left of Enter (the upper one of the two) followed by a space, and that does not bring up the console in your application.

Originally posted by WhatEver:

People from the UK have tested my demo before and they’ve never said they were having any problems :/. I wonder what’s wrong?

I didn’t think there was much of a difference between US and UK keyboards.

Originally posted by mellow:

Mine’s an Athlon 800MHz with a GeForce2 GTS.

uv 1226
color 942
both 800
none 1405

I also had a problem with the console key (I’m using a UK keyboard); I had to press the ' key (Shift-' => @) for the console.


Those are scary framerates.

I’m glad you told me that, Bob. I thought I knew quite a bit, but this goes to show there are always little details missed.

Thanks for the benchmarks. If anybody has a 3ds loader, and if it isn’t too much trouble, I’d like you to benchmark the same model so I can see how my drawing speed differs from yours.

I’ve been studying OpenGL, trying to get the best performance I can out of it. So far I’m pleased, but if one of yours blows mine away, I’ll know there’s room for more optimization.

I’m releasing the library to the public, so we wouldn’t want a slow model drawing routine, would we?

Just got the message
“Could not create OpenGL context”
and a crash.

Pentium 3 733, W2K SP1, TNT, Detonator 3 (650) driver from NVidia.

Doesn’t sound too good, huh?

Well, I got stuck in the console-not-working discussion, so I forgot to report my results.

Could only test the “default setting”, for known reasons - 75 fps w/vsync, hehe.


None - 1615
UV - 1460
Color - 1171
Both - 1018

GeForce 2 Ultra on an Athlon 900, running 6.50 drivers on W2K Pro.

Again, English keyboard - problem with the console. Pressing the ' key (which on my keyboard is two to the right of the ‘L’ key) brings down the console. Also, the keypress that brings down the console gets sent to the console itself, which kept making me mistype stuff.




I’ve seen that problem before, I think it is caused by the 6.5 detonators not supporting older versions of GLUT. Try updating your glut32.dll to the one here http://www.xmission.com/~nate/glut.html

Now for the low end!

Intel Celeron 366 (5.5x66 MHz bus - I usually run it at 5.5x83, but I’m giving it a break) with, get this you GeForce geeks, a 3dfx Velocity 100, the equivalent of an 8 MB Voodoo 3. The following are eyeballed averages.

none (no tex, no color) = 156
uv (texture) = 137
color + smooth shading = 73
color + flat shading = 75
uv + color (tex + color) = 66

I’ve no problems with the tilde key (~). Nice console.

On another note, your program has left me in a 640x480 desktop and not the 1024x768 desktop I started with.

I’ve noticed that none of NeHe’s tutorials would properly preserve my desktop resolution either. Not even when I tried the “fullscreen fix” tutorial. If anyone could shed some light on this I’d be most appreciative.

Nice demo! A 3ds library would be most useful. Does it have display functions, like GLshow3DS(), or does it just read the files for you and convert them to a list of tris?

Heaven (heaven@knology.net)

GAH! Bugs

I am using GLUT, alright. I use it for the mipmapping and for setting up the perspective.

I have a way of getting the original res back, but I’ll have to put it here after work 'cause I’m low on time.

Thanks for the compliment on my console. I’m using raster fonts right now; they’re beyond slow, so I’ll convert it to textures in the near future.

What my library does is read the 3ds file and place all the contents useful to a game model into a ready-to-use class. All you really need to do to open a 3ds file and then draw it is this:

rm_model Rm2Model;

//ignoring the return value for simplicity

//some other things you can do
//set one object to draw an environment map
Rm2Model.SetEnv("objectname", true);
//to draw a model with the envmap
//I prolly could have overloaded the func :P
Rm2Model.DrawEnv(TextureNames, EnvMapName);