Hi all, I have run into a problem that leaves me standing here clueless.
The problem occurs while rendering lens-flare quads (the usual billboarded ones) and blending them with glBlendFunc(GL_ONE, GL_ONE).
In debug and in release — in all optimization modes except one — the quads blend fine. But with the Maximize Speed setting in the Optimize drop-down, the flare behaves strangely.
The weirdness is that at some angles between the camera and the flare, the blending mode seems to change. The flares don't disappear, but they get more intense when I look roughly towards them.
As I mentioned, this ONLY happens with the Maximize Speed option in MSVC. Has anyone encountered weirdness like this? Does anyone have suggestions on how I could get rid of it?
I'd appreciate any ideas.
It sometimes happens that I have to deactivate the compiler optimizations in parts of my programs (see the #pragma optimize directive).
As with any program (from MS), the Microsoft compiler has bugs, and some problems can be introduced by the optimiser…
The first time I hit this problem, I spent two weeks hunting what I thought was my own bug…
If you can play with the order of the instructions, that sometimes solves the problem!
Can't help you more than that, though…
This kind of behavior is not necessarily a compiler bug: it may simply mean that the compiler is simple (or stupid, if I may call it that).
That means it can compile faster. It also means you need to follow some rules to avoid this kind of behavior. I know of a web site that explains this in detail, but I only have the address at home.
I solved the problem temporarily…
What I did was set the optimization option to Customize and enable all optimizations except Favor Small Code. I didn't see any noticeable drop in FPS, but that could also be because the renderer isn't being pushed to its maximum right now.
But so far I am satisfied.
V-man: if you could mail me or post the address, that would be great.
Thanks for your replies!