Crashes on GF3

First off, Humus, your engine looks great!
How big are the 3D textures you use?

The memory-leak checking in MSVC sounds cool, but I can't get it to work.

MSVC can't find any of the functions.
I've put the code Zak McKrakem posted into main.cpp.

Should I include some extra headers?

Thanks, and good luck with the engine, Humus!

Holy mother of all cows!!

This is sweet!

In case you haven’t seen it yet, I took 2 screenshots:

Anisotropic filtering disabled
Anisotropic filtering enabled (2x)

With anisotropic filtering disabled I'm getting between 30 and 130 FPS, but with it enabled it's very slow (almost seconds per frame instead of frames per second).
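For reference, 2x anisotropic filtering is normally just per-texture state from GL_EXT_texture_filter_anisotropic, roughly like this (a generic sketch, not necessarily what the demo does; textureId is a placeholder name and glext.h is assumed to provide the tokens):

[code]
#include <GL/gl.h>
#include <GL/glext.h>   // assumed available for the EXT_texture_filter_anisotropic tokens

void enable2xAniso(GLuint textureId)   // hypothetical helper
{
    // Clamp the requested 2x to whatever the hardware supports.
    GLfloat maxAniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);
    GLfloat aniso = (maxAniso < 2.0f) ? maxAniso : 2.0f;

    // Per-texture state: applies to the currently bound texture object.
    glBindTexture(GL_TEXTURE_2D, textureId);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, aniso);
}
[/code]

On hardware that supports it, 2x anisotropy should cost only modestly more than plain trilinear filtering, so a drop from tens of FPS to seconds per frame points at a software fallback or a driver problem rather than the expected cost of the feature.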

WTF! That anisotropic-enabled JPEG looks well shagged!

You should see it moving.

Anyway, I forgot to say that I’m using the 23.12 drivers.

I've heard that the 27.xx drivers are much better, so I'll try running with those in a few minutes.

Originally posted by FXO:
[b]First off, Humus, your engine looks great!
How big are the 3D textures you use?

The memory-leak checking in MSVC sounds cool, but I can't get it to work.

MSVC can't find any of the functions.
I've put the code Zak McKrakem posted into main.cpp.

Should I include some extra headers?

Thanks, and good luck with the engine, Humus![/b]

Thanks!
It still doesn't look as intended on GF3: the diffuse-lit object looks twice as bright as on my Radeon 8500. I'm not sure which driver does it the right way, but after some quick testing my card looks much closer to what I'd expect; I'll have to look deeper into it. After changing RGB_SCALE I got some texture flashing, and looking at the GF3 screenshots there seems to be texture flashing there too (upper right and upper left corners of richardve's non-aniso screenshot). It could be something wrong with my code.
The anisotropic error must be a driver bug though.
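For context, here's a generic sketch (not the engine's actual code) of how GL_RGB_SCALE is typically used with ARB_texture_env_combine to brighten a modulated diffuse pass. If two drivers handle the scale differently, the same pass can easily come out twice as bright on one of them:

[code]
#include <GL/gl.h>
#include <GL/glext.h>   // assumed available for the ARB_texture_env_combine tokens

// Modulate the texture with the primary (diffuse) color, then scale the result by 2.
void setupDiffuseStage()   // hypothetical helper
{
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB,  GL_MODULATE);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB,  GL_TEXTURE);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB,  GL_PRIMARY_COLOR_ARB);
    glTexEnvf(GL_TEXTURE_ENV, GL_RGB_SCALE_ARB,    2.0f);   // 1.0f leaves the result unscaled
}
[/code]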

The 3D textures are 64 x 64 x 64.
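For a sense of scale, a 64 x 64 x 64 RGBA8 volume is 64*64*64*4 bytes = 1 MB of texel data before mipmaps. Uploading one generically looks something like this (glTexImage3D is core since OpenGL 1.2; volumeTex and volumeData are placeholder names, not anything from the engine):

[code]
#include <GL/gl.h>

void uploadVolume(GLuint volumeTex, const unsigned char *volumeData)   // hypothetical helper
{
    glBindTexture(GL_TEXTURE_3D, volumeTex);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8,
                 64, 64, 64, 0,                          // width, height, depth, border
                 GL_RGBA, GL_UNSIGNED_BYTE, volumeData); // ~1 MB of texels
}
[/code]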

About the memory leaks: I just cut'n'pasted the code into mine and it worked with no extra headers, though it could of course be defined in some header I had already included. I was able to track my memory leak down, by the way: I had forgotten to free a struct in my .png loading code.
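For anyone else hitting the "MSVC can't find any of the functions" problem: the CRT debug-heap routines live in <crtdbg.h>, so whether you need an extra header just depends on what's already pulled in. A minimal sketch of the standard CRT mechanism (not necessarily the exact code Zak McKrakem posted):

[code]
// Debug builds only: _CRTDBG_MAP_ALLOC makes malloc/free leaks report file and line.
#define _CRTDBG_MAP_ALLOC
#include <stdlib.h>
#include <crtdbg.h>

int main()
{
    // Dump any allocations still live when the program exits.
    _CrtSetDbgFlag(_CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF);

    char *leak = (char *)malloc(16);   // deliberately never freed
    (void)leak;

    return 0;   // the leak report shows up in the debugger's Output window
}
[/code]

In a Release build (where _DEBUG isn't defined) the _Crt calls compile away to nothing, so the check only exists in Debug builds.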


Well, 27.42 doesn’t solve the problem.

The extreme brightness problem on GF3 is solved now and the demo has been updated. It should now run and look equally good on GF3 and Radeon 8500. I've also added coronas, with a soft fade in and out of visibility.
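One common way to get that kind of soft fade (just a sketch under my own assumptions; the demo's actual method isn't shown in the thread) is to ease each corona's alpha toward a 0/1 visibility target every frame instead of toggling it:

[code]
// alpha is persistent per-corona state; visible is the result of whatever
// visibility test the corona uses; dt is the frame time in seconds.
void updateCoronaAlpha(float &alpha, bool visible, float dt)   // hypothetical helper
{
    const float fadeSpeed = 4.0f;                // roughly a quarter-second fade
    float target = visible ? 1.0f : 0.0f;
    alpha += (target - alpha) * fadeSpeed * dt;  // exponential ease toward the target
    if (alpha < 0.0f) alpha = 0.0f;
    if (alpha > 1.0f) alpha = 1.0f;
}
[/code]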

Yeah, looks way better now!

And it's a few frames per second faster too (~10).

Yeah, but the performance increase is mostly because I made the light in the middle a little smaller (800 -> 700), which saves some fillrate and looks a little better too.

On the texture completeness problem, this is something we have fixed recently.

Separately, I wanted to address the issue of finding bugs. A couple of you seemed to state that you knew the texture completeness bug was there. In the future, I encourage you to report these issues. This particular bug was simply a cut-and-paste error that was quickly fixed. I posted the info on how to report issues in the “Correct VAO usage” thread.

  • Evan
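For readers who haven't run into it: a texture is mipmap-incomplete when its minification filter requires mipmap levels that were never uploaded, and sampling an incomplete texture behaves as if texturing were disabled. A generic illustration of the usual trap and the two standard fixes (not necessarily the exact case the driver bug involved; tex, width, height and pixels are placeholders):

[code]
glBindTexture(GL_TEXTURE_2D, tex);

// The default min filter is GL_NEAREST_MIPMAP_LINEAR, so uploading only
// level 0 leaves the texture incomplete.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);

// Fix 1: switch to a filter that doesn't need mipmaps...
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

// ...or Fix 2: build the full mipmap chain instead of a single level:
// gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA8, width, height,
//                   GL_RGBA, GL_UNSIGNED_BYTE, pixels);
[/code]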

Originally posted by ehart:
[b]
On the texture completeness problem, this is something we have fixed recently. - Evan

[/b]

I had this problem 8 months ago!

I believe I sent an app, but I couldn't find the email, so most likely I forgot.

I had that problem too quite a long time ago. Or more correctly, I didn't have any problem, but others with nVidia cards did. At the time I didn't think of it as a driver bug, more as a “driver makes the best of the situation” thing, so I didn't report it, but I suppose I should have.

Otherwise, I report all issues I experience when I feel pretty sure it's the driver and not my own code. If I'm unsure, I usually post a question on this forum instead. During this project I've reported several bugs and provided a number of apps illustrating the problems in question.